Preparing For SES Chicago
Some of you may recall that a while back, I let my mouth run sarcastic in the direction of some SEO experts, a topic conference, and by implication an entire industry… and ended up publicly apologizing for same, when it turned out I’d been, if not libelous, then at the very least grossly unfair. In the process, Danny Sullivan of Search Engine Watch, the guy who organized the conference I was maligning, floated the idea that I might come speak at one of their conferences. I indicated that I’d be interested.
As a result, I’ll be appearing on a panel at SES Chicago. The other two panel members are, as it happens, the very same people I bad-mouthed back in August. So that ought to be interesting. More information, and links, are available on the Events page at Complex Spiral Consulting.
So what standards-centric question(s) do you think I should ask of these SEO experts? The one that’s top of my list is: “Exactly what effect, if any, does semantic markup have on search engine rankings?” A related question: “How does content ordering affect search engine indexing?” I’m sure there are others, and I’m happy to be a conduit for asking them of people who should know, and getting the answers back to you. So ask away—and be polite, please. I’m not going to be talking to comment spammers, but to people who learn the ins and outs of search engine behaviors to help clients get wider exposure for their content. They’re not too dissimilar from those of us who learn the ins and outs of browser behavior to help clients get their content online in the first place. I failed to respect that before, and won’t make the same mistake again.
Comments (21)
A while back you linked a test that ran differently-structured markup through the search engines to attempt a scientific analysis of how semantic markup affects weighting. I referenced this at WE04, and was approached by an SEO guy who basically said it was wrong, and that semantic markup does have an effect on weighting.
Of course the results of the test would be disappointing, he said; a test page wouldn’t be exposed to real-world linking conditions. PageRank works with incoming links, so weighting depends on those more than anything. Semantic markup is a boost to existing ranking, it appears, but not a magic bullet. Inbound links are still largely the determining factor.
Oh, and that’s only meant to be anecdotal. I’d definitely like a second and third opinion before citing it again.
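For context, the link-driven weighting described above matches the PageRank formula Brin and Page published in their original paper (a simplified model; the live engine presumably uses many more signals):

\[ PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)} \]

Here the T_i are the pages linking to page A, C(T_i) is the number of outbound links on T_i, and d is a damping factor (0.85 in the paper). Nothing in the formula looks at markup at all, which fits the commenter’s point that semantic markup may boost, but cannot replace, link-driven ranking.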
Earlier this year I worked on a site that had been passed over by the search engines and had very few links to it. After I redesigned the entire site with semantic markup and the like, still without much in the way of external links, within two weeks it was appearing in the top 10 for the required search terms in all the big-name search engines. Soon after came external links and the number one slot.
I can only attribute the success of this redesign to the markup being search-engine friendly, as the content was only lightly updated.
I look forward to hearing the answer to your top-of-the-list question. I am already more than convinced, and the client is still doing cartwheels! Their business has been given a boost, and they now show much more interest in updating their pages, which is easier than it used to be.
I’d be interested to know how the reputation of a particular site (read: domain name) affects the rankings of individual pages within that site. I’ll use my personal site as an example.
jeffcroft.com does pretty well on Google. Most of the traffic I get from Google is related to an article I wrote on how to construct a Hold ‘Em (poker) table. This, I’m sure, is mostly due to a great number of inbound links from poker sites that are heavily trafficked. I’ve also made a few standards- and design-related posts that have gotten a good number of links from elsewhere, so this all makes good sense to me.
Now, a few days after Halloween, I posted a blog entry with some photos from a party I attended in which I (quite embarrassingly) wore a “catwoman” costume. To my knowledge, there have been few to no links to this post. There have certainly been no links to it from any site of significance. However, days after the post was made, my site now (quite embarrassingly) returns as the second listing on Google when you search for “catwoman costume”.
The only explanation I can derive is that jeffcroft.com has a decent reputation within Google, and therefore hits on my domain are given some kind of preferential treatment. I’d love to know if this is true or not…
Thanks in advance!
It would be very nice if you could raise those questions and push those SEO people for more (and it would be nice to hear from the search engines themselves, too). Let us know what comes out of it.
In my experience, good markup does help. As do incoming links, as does good content. Especially the latter. That’s one problem I have with my clients: getting them to describe their business or products in more than just a few words and some (pretty?) pics.
Jeff, similar to you, my personal site comes second in Google for a search for “strange spiders”! This was after posting an anecdote about painting our house and some paint-splattered arachnids escaping the brush. Could it be that the bots get addicted to clean code and come back for more?
The number one question I have for SEO “experts” is: how do they know how search engines “work”? Examples and results of test cases would be nice… but it’s a trade “secret”, I’m sure.
OK enough with the snarky Q’s :)
Really I am interested to see what value they place on correct semantic markup, considering the number of “cheats” many SEOs recommend, such as incorrect code (putting half your content in an H1 and the rest in an H2, for example), and the fact that search engines now treat such practices unfavourably.
I have had little trouble getting well-marked-up sites to achieve good rankings on Google.
Why aren’t these guys our biggest supporters? IMHO they should be in favour of easy-to-parse, semantically correct code.
I think those of us on the front lines often cringe the instant SEOs become involved in a project. I know I have done more than a few things because clients have demanded them on the advice of an SEO… not once have I thought them sound, intelligent changes.
Are SEOs aware of this? Are they trying to do a real job, or just make a quick buck off the gullible?
Don’t get me wrong: there is definitely a real job / place for SEO services, and I have in the past always factored such services into my quotes. But I can cite real, proven, valuable things I do with this allocated time. And I have the results to prove it.
What are the experiences of SEOs with feeding different content to search engines (hiding certain non-vital elements, daily quotes for example) so search results don’t get polluted?
What are the ethical considerations of doing this?
On the question of metadata: do SEOs think that a standardised metadata set (such as Dublin Core) is ever going to be adopted? Do we need it in this day and age?
Language information in XHTML or HTML is a topic I am interested in, because I am from the Danish language area and I am not always happy with the results the search engines give when I am searching. For example, when I search in Google Denmark and click the option “show only pages in Danish”, it gives me a lot of pages outside the Danish language.
My question is: do search engines rank web pages with lang="da" or xml:lang="da" (da = Danish) better, or does it not matter at all to include these language attributes?
My own experience from Denmark is that not many web designers use these language attributes at all. Richard Ishida from the W3C says that it’s always a good idea to include language information in (X)HTML documents. See
http://www.w3.org/International/tutorials/tutorial-lang/
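For anyone who hasn’t seen these attributes in the wild, here’s a minimal sketch of what’s being asked about: declaring Danish at the document root in XHTML, following the W3C tutorial linked above (the page content is made up):

```html
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<!-- lang serves HTML processors; xml:lang serves XML processors -->
<html xmlns="http://www.w3.org/1999/xhtml" lang="da" xml:lang="da">
<head>
  <title>Om sprogangivelse</title>
</head>
<body>
  <!-- Passages in another language can be marked explicitly -->
  <p>Det meste af siden er på dansk, men et citat kan markeres:
    <q lang="en" xml:lang="en">This sentence is in English.</q></p>
</body>
</html>
```

Whether any engine actually consults these attributes is exactly the open question.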
Without wanting to sound too cynical, the secretive and slightly immoral nature of much Search Engine Optimisation seems to lend itself well to snake oil salesmen. The saying goes “You can’t con an honest man” and looking into the psychology of it all, I would guess that your average dodgy SEO customer doesn’t want an answer as obvious as sensible mark-up. They want “secrets” and “tricks” that are only available to the select few, the “easy money” available to those willing to “seize the moment”.
To be clear, I think people with actual content should have access to information that allows them to avoid silly mistakes and present their sites in the best light possible. I cannot help but think that correct semantic markup will help, if only because there will be less ‘junk’ surrounding the actual content, and because people forget that spiders can’t read text in images. But as soon as you start trying to ‘fool’ the search engines you’ve crossed a line into lies and fraud, and for that kind of market a slightly mystical ‘secret’ will always sell better than the straightforward truth (see: crash diet fads vs. eating healthily, expensive cables vs. audio engineering, every popular management theory ever, etc.), and this is true regardless of actual results. In fact, if there is no clear, independent way to gauge success, all the better, as that requires people to return to the ‘expert’ for guidance.
Hmm, I think my cynicism won through there in the end, so I’ll end with my cynical advice: even if it isn’t true that semantic, accessible, standards-based code is better for search engine ranking, tell them it is anyway, and make sure it gets repeated enough until it’s accepted wisdom.
I just remembered a thread over on mezzoblue relating to sIFR that went off on a tangent about Flash/image replacement and its effect on search engines. Some folks were apparently concerned that eventually search engines would begin parsing not only (X)HTML, but also CSS and/or JavaScript, in order to detect whether people were using image replacement techniques. Obviously, these replacement techniques could be used to “mask” the real content a visitor would see with some bogus content in the (X)HTML.
I’m pretty sure no search engine does this now, but I guess it’s possible at some point. Seems a bit ridiculous, though. Who knows.
I have a couple, actually. Yes, search engines can’t parse the text from images, and typically ignore text in html comments. But what about the alt attribute? Or the title attribute in anchor tags? Or the information in acronym tags? Is this information used in the page ranking process? What, if any, accessibility information is included in ranking algorithms?
I’m in the process of cleaning up my design act, so to speak, and I’d like to give my sites and my clients’ sites good rankings, good markup, and good accessibility. Thanks.
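To make the question concrete, here is the kind of markup being asked about; whether engines weight any of it is exactly what’s unknown (filenames and text are made up):

```html
<!-- The alt attribute: the only text an engine can get out of an image -->
<img src="chart.gif" alt="Site traffic doubled between June and November" />

<!-- The title attribute on an anchor -->
<a href="archive.html" title="Complete article archive, sorted by topic">Archives</a>

<!-- An expansion supplied via the acronym element -->
<acronym title="Search Engine Optimization">SEO</acronym>

<!-- HTML comments like this one are typically ignored by engines -->
```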
Trackback ::
Web Developer News
JupiterMedia presents Search Engine Strategies Chicago
What: Search Engine Strategies Chicago
Where: December 13-16, 2004 at McCormick Place, Lakeside Center in Chicago
Event Overview:
Organized by world-renowned search authority Danny Sullivan.
Delivers real-time actionable information you nee…
I’d like to know the effect of using display:none in CSS on pages. I had designed a page to have nice headings in text-based browsers using <h1> (and to be read by search engines, I’ll admit), but hid those headings via CSS, as they were repeated graphically elsewhere on the page. After a little while, a Google search on the words in the heading returned us ranked fairly highly, but then a little while later, not at all. My suspicion is that Google “caught on” to the fact that the heading wasn’t visible to graphical clients and dropped our ranking accordingly. At any rate, I’m curious whether hidden or otherwise invisible text will be considered, disregarded, or even punished.
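For readers who haven’t used the technique described, here’s a minimal sketch with made-up content. The worry is that an engine could read the display: none as an attempt to show spiders text that users never see:

```html
<style type="text/css">
  /* The heading exists for text browsers and spiders,
     but never renders in graphical browsers */
  h1.replaced { display: none; }
</style>

<h1 class="replaced">Acme Widgets: Hand-Made Widgets Since 1952</h1>
<!-- Sighted visitors get the same words from this banner graphic instead -->
<img src="banner.gif" alt="Acme Widgets: Hand-Made Widgets Since 1952" />
```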
I have recently completed a project for a national non-profit organization that had trouble ranking in Google for important keywords.
We redesigned the site using standards and clean semantic based markup. The site had over 400 articles. Most articles ranked between 50-150 on Google. Now many are ranked on the first page of results on some topics. Some are still buried down around 150 or greater. Most of this is due to ‘page rank’ and the number of links to a particular topic.
What I have learned from this project about Standards based markup and SEO:
1. PageRank is the #1 factor, i.e. how many sites link to an article and the linking sites’ associated PageRank.
2. The HTML “title” tag is the second-biggest factor.
3. Keyword density is third.
4. Keywords in semantic tags: h1, h2, li, etc.
5. The meta description tag.
6. Page weight.
The actual order of those is debatable. It is just what appears to us, based on the tweaks we have done on the 400+ pages of their site.
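For what it’s worth, factors 2, 4, and 5 in that list all live in ordinary markup. A minimal sketch, with made-up page content:

```html
<head>
  <!-- Factor 2: the title element -->
  <title>Pregnancy Symptoms: Early Signs of Pregnancy</title>
  <!-- Factor 5: the meta description -->
  <meta name="description"
    content="Common early signs of pregnancy and when to take a test." />
</head>
<body>
  <!-- Factor 4: keywords inside semantic elements -->
  <h1>Pregnancy Symptoms</h1>
  <h2>Early Signs of Pregnancy</h2>
  <ul>
    <li>Missed period</li>
    <li>Morning sickness</li>
  </ul>
</body>
```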
Bottom line: standards-based design did make an impact on the American Pregnancy Association site, because:
1. We got rid of tons of useless inline JavaScript, which bogged down the bots. Bots bail after too much upfront garbage like JavaScript and never get to the important first 250 words of the actual content.
2. The navigation on their site was done entirely with graphic files. We changed the nav to text-based menus using the standard “ul” list methods that Eric teaches in his books (see the sketch after this list). This added a lot of good keyword terms like “unplanned pregnancy” and “labor and birth” to the HTML that were previously invisible to the search engines because they were GIFs.
3. Semantics! The use of h1 and h2 tags and list items is very search-engine friendly.
4. Overall page weight was cut significantly by going to standards. All things being equal, search engines rank faster-loading pages higher.
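As promised in point 2, here’s a sketch of the list-based navigation technique. The URLs and id are invented, but the idea is that the link text becomes real, indexable HTML rather than pixels in a GIF:

```html
<style type="text/css">
  /* Style the list to look like the old graphical nav */
  #nav { list-style: none; margin: 0; padding: 0; }
  #nav li { float: left; }
  #nav a { display: block; padding: 0.5em 1em; background: #369; color: #fff; }
</style>

<ul id="nav">
  <li><a href="/unplanned-pregnancy/">Unplanned Pregnancy</a></li>
  <li><a href="/labor-and-birth/">Labor and Birth</a></li>
</ul>
```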
Now, the million-dollar question is: how much of the move from page 12 in Google to page 1 was due to the standards-based design? Don’t know, but I’m sure it did help some. “Pregnancy Symptoms” ranked down in the cellar before, and as of today is on the first page of results in Google and number 1 in Yahoo!
Be happy to share more of what we learned on this topic. Just send me an email.
American Pregnancy Association web site
David Wilbanks
What would be interesting is a comparison of ethical SEO techniques and the W3C WAI checkpoints, to see which ones complement each other and which ones are at cross-purposes. Metaphorically, Google is sometimes described as the blind and deaf user telling his billions of friends about the sites he can see.
It would be nice to emphasise that accessibility and proper search engine optimisation are essentially the same thing – providing access to content.
There are other posts out there about how accessibility and SEO complement each other.
I’d add that people should think of the things they want to ask the search engines themselves, not just the SEO people, about developing standards and search engine indexing.
Many of the SEO people who practice on-site/ethical/white-hat/content-driven/call-it-what-you-will SEO are oftentimes messengers delivering bad news that they have no control over. It’s not their fault that search engines aren’t reading style sheets, for example, nor that they cannot parse words out of images. So they can tell you what tradeoffs have to be made, but ultimately it’s the search engines themselves that must change and advance for those tradeoffs to go away (and, by the way, often the tradeoffs will be extremely small).
With luck, there will be two search engine reps on this panel as well, so think of what you want them to hear from Eric, too.
Only a few days to go, so I thought I would add my $0.02.
Why not ask what everyone who works in the online industry can do to grow, collectively, the percentage of marketing budget spent on the web? Seems to me that SEOs, designers, and everyone else who makes money from websites are either competing for a share of a limited online spend, or working to grow the total pool of funds. What can we all do to make the pie grow?
I’ve got to admit, I have always used semantic markup for my sites and it appeared to work; I never really thought about it in scientific terms!
To me, it just seems to make sense, irrespective of SEO, and if it works for SEO, then that’s great too!
However I would be interested to hear what the result of your panel was… Or was that too long ago now?
You can see my results here: http://www.hotproject.com