Archive for January, 2007

Spending An Afternoon With Matt Cutts


****************
* *
****************
How would you like to spend an afternoon with Matt Cutts? This past summer, Cutts released a series of fifteen videos answering questions from webmasters and offering tips on things like getting the most out of conferences.

Since I am not sure about the Google Video deletion policies, I’ve taken the videos from Google Video and made local copies of them for preservation purposes. I’ve posted them below after checking Matt’s usage restrictions. Obviously, the videos here are extremely small, so click the ‘Larger Video and Transcript’ link to get, well, a larger video and the transcript.

Feel free to take these videos and embed them on your own blog!

P.S. Sometimes the videos don’t all load. If they don’t all appear, just refresh!

Matt Cutts #1: Qualities of a Good Site
Length: 5 min 40 sec
Larger Video and Transcript

Matt Cutts #2: Some SEO Myths
Length: 4 min 10 sec
Larger Video and Transcript

Matt Cutts #3: Optimize for Search Engines or for Users?
Length: 4 min 25 sec
Larger Video and Transcript

Matt Cutts #4: Static vs. Dynamic URLs
Length: 4 min 30 sec
Larger Video and Transcript

Matt Cutts #5: How to structure a site?
Length: 4 min 46 sec
Larger Video and Transcript

Matt Cutts #6: All About Supplemental Results
Length: 4 min 12 sec
Larger Video and Transcript

Matt Cutts #7: Does Webspam use Google Analytics?
Length: 5 min 11 sec

Matt Cutts #8: Google Terminology
Length: 4 min 40 sec

Matt Cutts #9: All about datacenters
Length: 4 min 36 sec
Larger Video and Transcript

Matt Cutts #10: Lightning Round!
Length: 5 min 2 sec
Larger Video and Transcript

Matt Cutts #11: Reinclusion requests
Length: 2 min 44 sec
Larger Video and Transcript

Matt Cutts #12: Tips for Search Engine Strategies (SES) San Jose 2006
Length: 5 min 40 sec
Larger Video and Transcript

Matt Cutts #13: Google Webmaster Tools
Length: 6 min 34 sec

Matt Cutts #14: Recap of SES San Jose 2006
Length: 9 min 0 sec
Larger Video and Transcript

Matt Cutts #15: Data center comments
Length: 8 min 55 sec
Larger Video and Transcript

Matt Cutts #15: Data center comments

Here’s the fifteenth in the series of videos posted by Google’s Matt Cutts to Google Video over the past year. These are important for every web developer to see. Please see Matt’s first!

See the rest of the videos!

Transcription

Hey Everybody! Good to see you again!

This time I thought I would talk about datacenter updates, what to expect from Google over the next few weeks, and stuff like that.

But before I do, I didn't get to talk about the fun schwag from the Search Engine Strategies conference. One of my favorites, check it out (holds up a hat), is a white hat. Oooh! It's got "SEO" in hidden text. Don't say SEOs don't have a sense of humor.

I thought this one was kind of fun (holding a picture): a picture of Jake Baillie with a fake autograph there, and here I got a real autograph. In fact, I got several of them. Oh yes. What can I do with lots of pictures of Jake Baillie? Maybe I can sell them and do some arbitrage or something like that.

Anyway!

Also, there was at least one British SEO who evidently wants to keep me from doing anything productive for a long, long time. Check that out (holds up a stack of three voluminous books). That's three thousand five hundred plus pages of science fiction. Huh. Yes. The funny thing is, in Britain these three books are published as three books, and in the United States they take these three books and publish them as nine books. What does that say about British readers versus American readers? Yes, that's what I thought. So I'll probably donate these to the webspam team, whoever needs some hard SEO... hard sci-fi, I should say.

OK! Data Center Updates.

So, there are always updates going on, you know, practically daily. If not daily, then a pretty large fraction of our index is updated every day as we crawl the web. We also have algorithms and data pushes that go out on a less frequent basis.

So, for example, there was a data push on June 27th, July 27th, and then on August 17th. And again, it's an algorithm that's been running for over 1.5 years. If you seem to be caught in that, you are more likely to be someone who reads SEO boards. So you might want to think about ways you could back your site off: think less about what the SEOs on the boards are saying, and how you can sort of not optimize quite as much on your site. That's about as much advice as I can give, I am afraid!

BigDaddy was a software infrastructure upgrade, and that upgrade was finished around February. It was pretty much a refresh of how we crawl the web and, in part, how we index the web. That's been done for several months and things have been working quite smoothly.

There was also a complete refresh or update of our supplemental results index infrastructure. That happened a couple of months after BigDaddy, so it's been done for a month or two, and it was a complete rewrite. That indexing infrastructure is different from our main indexing infrastructure, so you'd expect to see a few more issues whenever we roll something like that out. We saw, you know, more small, off-the-beaten-path stuff: exclusion terms where you use the minus sign, the noindex meta tag, stuff like that. And with the way the supplemental results worked with the main index, you would often see site: result estimates that were too high.

There was at least one incident where there was a spammer that some people thought had 5 billion pages, and when I looked into it, their biggest domain had under 50,000 pages in total. People had been adding up these site: estimates and ending up with a really big number that was just way, way off.

So, one nice thing is we have another software infrastructure update, whose main aspect is improving quality, but it also improves our site: result estimates as well; that's just sort of a side benefit. I know that it's not at all data centers, in the sense that it can run in some experimental modes, but it's not fully on at every data center. And they were shooting for the end of the summer to have it live everywhere, but again, that's a hope, not a promise. So if things need more testing, they will work for longer to make sure that everything goes smoothly. And if everything goes great, then they might roll it out faster. But that is a really nice piece of infrastructure, and it's just a side benefit that site: result estimates get more accurate.

It's kind of interesting, let me talk about it for a minute, because I saw at least one guy who had said, you know, "What happened with site: result estimates on Google?" He was comparing two completely different data center IP addresses, and they were different, and he was worried about that. And yet, he had exactly one page in Yahoo and no pages in Ask. If you looked at his link page, there were a ton of links to pharmacy sites; not just one pharmacy site, but a lot of pharmacy sites.

And so I would say your time, your focus, is better spent looking at your server logs and asking how to improve the quality of your own site, not worrying about something like site: result estimates.

So let me drill down into some reasons why that's true.

Number one: they are estimates. We don't claim that they are exact. In fact, if you look at them, they are only exact to three significant digits. And we do that to give people an idea of how many results there are for a site: query. But we don't claim that they're 100% precise.
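
As a quick illustration of what "exact to three significant digits" means, here is a small sketch in Python; the count used is made up:

```python
from math import floor, log10

def to_three_sig_figs(n: int) -> int:
    """Round a result count to three significant figures,
    e.g. 1,234,567 -> 1,230,000."""
    if n == 0:
        return 0
    magnitude = floor(log10(abs(n)))  # position of the leading digit
    factor = 10 ** (magnitude - 2)    # keep only the top three digits
    return round(n / factor) * factor

print(to_three_sig_figs(1234567))  # hypothetical exact count -> 1230000
```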

And truthfully, I didn't consider it very high priority. There was recently a change pushed out that made the plain old result estimates much more accurate for unigram, or single-word, queries. And I spent about a half hour with the guy who did the change. He even asked me, "Well, do you think it's worth working on making the result estimates for site: more accurate?"

And this was like five or six months ago, maybe even more. At that time I said, "No! Pretty much nobody pays attention to those. You know, they look at their server logs; it's not really a high priority." And it's gotten to be where more people are asking about these things, so I am sure we will pay more attention to it.

But, in general I would spend more time worrying about good content on your site, looking at your server logs to find out niches where you can make new pages and make things that are more relevant.

And you know, the whole notion of watching data centers is going to get harder and harder for individuals going forward, because, number one, we have so much stuff launching in various ways. I have seen weekly launches where there are a double-digit number of things going out, and these are things that are under the hood, strictly quality. They are not changing the UI or anything like that. So if you are not making a specific search in Russian or Chinese, you might not notice the difference. But it goes to show that we are always going to be rolling out different things, and at different data centers you might have slightly different data.

The other reason why it's not worth watching data centers is because there is an entire set of IP addresses, and if you are a super-duper, gung-ho SEO, you'll know, oh, 72.2.14.whatever. That IP address will typically go to one data center. But that's not a guarantee. If that one data center comes out of rotation, we are going to do something else to it; we are going to actually change the hardware infrastructure. And everything I have been talking about so far is software infrastructure. So if we take that data center out of rotation for some reason, that IP address will then point to a completely different data center. So the currency, the ability to really compare changes and talk to a fellow data center watcher and say, "What do you see at 72.2.14.whatever?", is really pretty limited.

So I would definitely encourage you to spend more time worrying about, you know, the results you rank for, increasing the quality of your content, and looking for high-quality people that you think should be linking to you and may not even know about you, and stuff like that.

I just wanted to give people a little bit of an update on where we are with various infrastructure. The fact of the matter is that we are always going to be working on improving our infrastructure, so you can never guarantee a ranking or a number one spot for any given term, because if we find out that we think we can improve quality by changing our algorithms or data or infrastructure or anything else, we are going to make that change.

So the best SEOs, in my experience, are the ones that can adapt. They would say, "OK, if this is the way the algorithms look right now to me, and if I want to make a good site that will do well in search engines, this is the direction I want to head in next." And if you work on those sorts of skills, then you don't have to worry about being up at 3:00 AM on a forum talking about "What does this data center look like to you? Did they change a whole lot?" and stuff like that.

So that’s the approach that I recommend.

Transcription thanks to Peter T. Davis

Matt Cutts #14: Recap of SES San Jose 2006

Here’s the fourteenth in the series of videos posted by Google’s Matt Cutts to Google Video over the past year. These are important for every web developer to see. Please see Matt’s first!

See the rest of the videos!

Transcription

OK Everybody! I am back. I am mostly over my cold and my wife is somewhere else tonight. So I get to make a video. Muhahahaha…

So, I thought I would give you a recap of Search Engine Strategies from my point of view and sort of cover some of the high-order bits and stuff that I thought was pretty neat.

A lot of people are curious about the industry news: what did the search engines announce, and what happened during the week?

So, Yahoo announced Sitebuilder, which is something that lets you make a free custom search engine for your own site. Google has something that's sort of related to that, but we rolled it out several years ago. So Yahoo for now looks like they have a slightly nicer custom site search, and that's free.

They also rolled out authentication in Site Explorer. So one thing you'll notice is you can now prove that you own a site in Site Explorer, and presumably you will be able to do more stuff down the road.

They also turned off the ability to do site: plus a domain on Yahoo, so a lot of people missed that during the conference. It's now a forced redirect to Yahoo's http://siteexplorer.search.yahoo.com. So you'll have to log in if you want to do a site: search on Yahoo now. You might be able to do a "-a" or "-the" to get around that. But it's pretty clear that they want to shunt most of the people doing the SEO kind of research to that one site and leave the main site for the regular searchers.

So, what did Google announce? Well, we rebranded and renamed Sitemaps to Google Webmaster Tools, and there is a new Google Webmaster Blog; the Sitemaps blog has been renamed. A lot of stuff has been reorganized so it's all in one spot, one place you can go to.

There was also the refresh of the supplemental results, which is kind of nice. People were complaining about results being from about 2005; I believe by the end of the month the new, fresher supplemental results will be out everywhere. The supplemental results are now mostly from the April, May, June, July time frame, and the earliest drop I know of is from February, so I know a lot of people are happy with the refresh of the supplemental results.

We also released a click fraud report. Kind of interesting. The auditing paid clicks session was kind of a barn burner. Guess what, you had to be there. Lots of fun. If you don't want to read the 17-page report, I would just read the appendix, where they sort of talk about mathematically impossible things and give some concrete examples. But it is a pretty interesting report, if you want to read it.

Microsoft didn’t really announce much and I actually support that. I don’t think Search Engines should try to roll stuff on a conference schedule, because then all the events get squashed into one and you sort of get lost in the noise. So I think its a not a bad idea to roll stuff out when its ready and not worry so much about launching during a conference, trying to get a big boost because of the press. So, all other search engines, including Google, don’t launch anything during the conferences. make life more mellow for everybody.

Probably the biggest industry news that happened was inadvertent. That was AOL accidentally, well, semi-accidentally, leaking queries for hundreds of thousands of users, millions of queries, and stuff like that. It was done in good faith; the researchers wanted to provide data so people could learn more about how people search with search engines. But it took about a day before people realized that the data could be tied to individual searchers and stuff like that. People have probably heard the fallout from that over the last couple of weeks, so I don't need to talk about it.

It was an interesting conference because I got to meet a few people for the first time. I got to meet Loren Baker, Jason Dowdle, Shawn Hogan, Jim Hedger, and Steve Bryant from eWeek. I enjoyed meeting everybody there. I enjoyed talking to a lot of people, from the lady from Netshops to the guy I shared lunch with. It was a lot of fun as far as talking to lots of webmasters. People who were there that I didn't get to talk to but would have really liked to: Lisa Barone (I don't know how to say it) from Bruce Clay; it sounds like Melanie Colburn from John Batelle's blog was there; Andy Beal, didn't get to talk to him this time, hopefully next time; and Donna Bogatin was there and sent a couple of reports in, but I never got to talk to her.

Other things that happened: it was actually a conference at which a lot of changes happened. It sounds like Andy Beal is moving to a different spot. Mike Grehan is moving to a different spot.

This is one of my favorites; nobody else I think noticed this, but Jeffrey McManus, who was a Yahoo search developer or API guy or something like that, left Yahoo. If you are not familiar with the name, he is the guy who said that the Google Maps API smelled like wet burnt dog hair or something like that. So, he is no longer at Yahoo. I think he is consulting now. So, if you want to get good consulting, I am sure you can talk to Jeffrey McManus.

Kanoodle, something happened with them. They moved to "SeeVast" or something like that. At first I thought it was just a name change, but evidently they have something going with Moniker or Moniker's naming page stuff. I didn't get to talk to Monte or Erik of Moniker and find out what Kanoodle is up to. But that's kind of interesting.

Probably the biggest change that I thought was entertaining was Niall Kennedy leaving Microsoft, which was a kind of funny moment. It started out that we were going to have a search engine blogger round table, and I think Robert Scoble was scheduled to be on the panel, and he left Microsoft. So Niall Kennedy was scheduled to take his spot. And then he announced that he was going to leave Microsoft too, like three days after this panel.

So there was at least one point where I was looking at Niall and somebody from Microsoft talking. I couldn't hear what they were saying, but I was imagining the Microsoft PR guy going, "You are going to be cool, alright?" and Niall going, "Yes, yes, I am going to be cool." And he was. He did a great job. He told a really funny story about international soccer and how you can avoid incidents by thinking about the impact of your words. So it was a lot of fun being on a panel with him, along with Gary Price and Jeremy Zawodny.

Other fun moments: I missed, I can't believe this, I missed Danny Sullivan in lederhosen. He lost a bet with Thomas Bindle and there are pictures all over the web. Just do a Google image search, or some other image search, and you should be able to find Danny Sullivan in lederhosen.

I got to talk to a lot of metrics companies and grill them about various things. I've still got a few posts to do about metrics.

Picking the brains of webmasters; of course, they picked my brain a little bit too. It's always good to talk to webmasters; I enjoyed that a lot.

It was fun to meet some Cuttlets. So, Jessica and Audry from an SEO shop down in LA, it was really nice to meet you. Lyndsay, it was nice to meet you as well. Didn't make my wife jealous at all. No sir. No marital problems there, I'll tell you. But it was a lot of fun meeting a ton of people, including a couple of Cuttlets.

I got a killer cold, which I am now over, so that's pretty good.

And there was one heart-stopping moment where Danny was talking to Eric Schmidt, the CEO of Google, who did a Q&A on the third day of the conference. Sergey showed up at SES back in '99 or 2000 and said something like, there is no such thing as search engine spam. Back then that was basically true, because Google was using PageRank and links and anchor text in ways that nobody had ever thought of before, and it was very hard to spam Google, and nobody worked on it because Google was really small.

But that quote haunted Google, or at least webspam, for a while. "There is no such thing as spam," said Sergey. So there was one moment when Danny Sullivan asked Eric Schmidt something like, "Oh, all this link stuff, people are always going to be trying to abuse it. Do you want to just go ahead and say now that everything is OK, there is no such thing as spam, you can do whatever you want?" He didn't say it exactly like that, but I still had this heart-stopping moment. I was like, "Eric, say the right thing, say the right thing..." And he did a fantastic job; heart attack avoided. He was really neat there, talking about the importance of webmasters and communication and stuff like that.

So, it was a lot of fun. It was a good conference. I am going to be out of conferences until maybe WebmasterWorld Las Vegas in November. So I am looking forward to some quiet time at home and just working on spam and stuff like that. But it was a lot of fun, and if I got to meet you at the conference, I am glad I did. And if I did not, I hope to meet you at a future conference.

Transcription thanks to Peter T. Davis

Matt Cutts #13: Google Webmaster Tools

Here's the thirteenth in the series of videos posted by Google's Matt Cutts to Google Video over the past year. These are important for every web developer to see. Please see Matt's first!

See the rest of the videos!

Transcription

Hey everybody! This is Matt Cutts. It's Monday, August 7th, and it's going to be the first day of Search Engine Strategies. I was picking SEOs' brains on Saturday, so I've already started to lose my voice a little bit. But I wanted to alert you to some stuff that people might have missed that just happened this past Friday. I think it might have gotten missed a little bit, partly because it happened at 9 o'clock on a Friday, and partly because a large fraction of the A-list, B-list, and C-list bloggers about search are all sort of on their way to, or arriving at, Search Engine Strategies San Jose.

So, Google has actually done quite a bit more lately to revamp the amount of information we provide to general users and to webmasters. One thing is that http://www.google.com/support has been beefed up a whole lot. So for all the different support stuff, there are a lot more answers with a lot more fresh information. It's pretty cool. If you go to google.com/support, that is sort of the one-stop shop for all sorts of general user support needs.

However, if you found your way to this video, you are probably not just a regular user. You are probably also a webmaster. And if you are a webmaster, there is a tool you need to know about, which used to be called ‘Sitemaps’ until Friday.

It all started off sometime last year, when this tool called "Sitemaps" let people submit all the URLs that were on their sites. They could even say things like when the URLs had last changed and which URLs are more important... all sorts of stuff. And a lot of people made tools to create those Sitemaps files, and that was fantastic.
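
A Sitemaps file is just XML in the format defined by the public Sitemaps protocol: a urlset of url entries, each with a required loc plus optional hints like lastmod, changefreq, and priority. Here is a minimal sketch in Python that generates one; the pages listed are hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical pages; <loc> is required, the other fields are optional hints.
pages = [
    ("http://www.example.com/", "2006-08-07", "daily", "1.0"),
    ("http://www.example.com/archives/", "2006-07-27", "weekly", "0.5"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc                # the page's address
    ET.SubElement(url, "lastmod").text = lastmod        # when it last changed
    ET.SubElement(url, "changefreq").text = changefreq  # how often it changes
    ET.SubElement(url, "priority").text = priority      # relative importance

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```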

The thing that happened after that is, the Sitemaps team decided to build a more general console, something that could help webmasters with all sorts of other problems. And so that's been called "Sitemaps". But Adam Lasnik came back from Search Engine Strategies London and said that when he talked about Sitemaps, everybody thought, oh, XML files and stuff like that. So just this last week, "Sitemaps" changed its name.

So, there is now an official area called Google Webmaster Central. If you go to that (it's just http://www.google.com/webmaster or webmasters; I'll make sure that they both work), you will get a set of lots of different tools. There is now an official Google Webmaster Blog, which is going to be mostly maintained by Vanessa Fox, and I am sure I will stop by from time to time to weigh in on various things. That used to be the Sitemaps blog, and its scope is broadening to include anything related to webmasters, which I think is fantastic.

The other thing is, the Sitemaps tool has now become Google Webmaster Tools. And it's got all sorts of stuff. It's not just a place where you can say, here are all the URLs that I've got, Google, please come crawl those URLs. Just off the top of my head, it's got a robots.txt checker, and it's got things to show you what errors in URLs it has seen....
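
Google's checker is interactive, but the same sort of check is easy to script yourself. A small sketch using Python's standard-library robots.txt parser; the site is a hypothetical example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site; point this at your own robots.txt.
rp = RobotFileParser("http://www.example.com/robots.txt")
rp.read()  # fetch and parse the file

# Would a crawler identifying itself as Googlebot be allowed here?
print(rp.can_fetch("Googlebot", "http://www.example.com/archives/"))
```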

Earlier today, in fact, I found a place where I had made a link without the http, and that doesn't work so well in WordPress. So I had gotten 404 errors whenever Google tried to crawl it, and I was actually able to fix the broken link by looking at that table.

In some cases we can tell you whether you have spam penalties or not. So if you have hidden text or something like that, we can actually show you that you have a penalty and let you file a reinclusion request, which we can give a little more weight to, because we know it's you: you verified and proved that you really own that site.

They also just did a new release on Friday, along with the change in the name, and they introduced a lot of pretty neat little stuff. Things like: show me all the query words that show up in each subdirectory, or show me the crawl errors in each subdirectory, and things like that.

However, the biggest thing that I am really happy about is something called preferred domain. Sometimes we see that people's links are, you know, not as uniform as they could be. Maybe they don't have all their ducks in a row, and so some of the links point to www.mattcutts.com and some of the links point to just mattcutts.com; without the www or with the www. And if outside sites, like the ODP or whatever, link to one and other people link to the other, Google tries to disambiguate that. It tries to figure out, oh, www and non-www are actually the same page and they are always going to be the same site. But we can't always get that 100% correct.

So this new feature in Sitemaps, the Google Webmaster Console or Google Webmaster Tools, whatever you want to call it, now lets you say, "OK, I verify I own this domain, and I verify I own it with the www as well. Now treat those as the same."

Now bear in mind, it's a preference. So the first thing is, it might take several weeks for it to go into effect. The next thing is, because it's a preference, we don't one hundred percent guarantee that if you say "I want www", we will always go that way. But in the normal, typical situation, within a few weeks you should see your URLs change from being split between www and non-www, if you have this issue, to all being on whichever one you prefer.
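
The preferred-domain setting fixes this on Google's side; many webmasters also enforce one hostname themselves with a 301 redirect, so links never split in the first place. A minimal sketch of that logic in Python; the choice of www as the preferred host is just an example:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.mattcutts.com"  # whichever host you prefer

def canonical_redirect(url: str):
    """Return (301, fixed_url) if the request is on the non-preferred host,
    or None if it is already canonical."""
    parts = urlsplit(url)
    if parts.netloc == CANONICAL_HOST:
        return None
    fixed = urlunsplit((parts.scheme, CANONICAL_HOST, parts.path,
                        parts.query, parts.fragment))
    return (301, fixed)

print(canonical_redirect("http://mattcutts.com/blog/"))
# -> (301, 'http://www.mattcutts.com/blog/')
```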

I volunteered my domain to be used as the guinea pig by the crawl guys, so they were whipping it back and forth from www to non-www, and things look like they are working pretty well.

So, PropstodayOUK asked for this feature, and a bunch of other people have asked for it too. I am glad we are getting around to it. I am sure we'll continue to look for ways we can take requests from webmasters and turn them into useful information that webmasters can get.

So if you haven't taken a fresh look at Google Webmaster Tools, I would highly recommend that you do. It's worth your time: you can find all kinds of errors, you can test your robots.txt, you can sometimes see penalties. There are words that you rank for, words that you got ranked and clicked on a lot for, and most importantly, there is this www and non-www setting. So if you have been affected by that, you can now tell Google which way you want it to be.

The Sitemaps team has been doing a great job. I am sure I'll continue to call them "Sitemaps" for a while, not being able to get used to the name change, but I'll get used to it eventually. I hope that you will give it a try. I think it can be useful for anybody who's got a site.

Transcription thanks to Peter T. Davis

Matt Cutts #12: Tips for Search Engine Strategies (SES) San Jose 2006

Here’s the twelfth in the series of videos posted by Google’s Matt Cutts to Google Video over the past year. These are important for every web developer to see. Please see Matt’s first!

See the rest of the videos!

Transcription

OK! This is Matt Cutts. It's Monday, about 1:00 AM, which means Search Engine Strategies starts in about nine hours. And fictional reader Todd Smith writes in and says:

“How would you recommend doing Search Engine Strategies? What tips or tricks can you give us, because I am going to the conference for the first time and I want to get the most out of it.”

That’s a great question, fictional reader Todd Smith.

First off, I would say go ahead and get checked in. You are probably going to get a bag with like 14 pounds of stuff in it. I would go through there and pick out basically just the little sheet of paper, like four pages, that says here are the sessions. And I would pretty much take the rest of the stuff to the hotel. You have probably checked into a hotel that's right near the convention center, so just drop everything off in the room.

Here is what I do. I take a backpack (showing a backpack). I also take my little pad of paper to write down feedback. It's a JanSport backpack. You'll notice that it's the exact same kind of backpack that Sawyer uses on Lost, and if I were going to be trapped on a desert island, this is what I would want too, because there are actually two completely different pockets. So you can put food and water in one, and you can put your laptop and charger kind of stuff in the other. So if your water leaks, you are not going to destroy your laptop. It's waterproof. Works very well. You bring your laptop, throw that sucker in there along with the schedule, and then if you go to the Expo and pick up some collateral brochures, throw them in there and you are in good shape.

I would probably sit down and circle the sessions that would be of interest to you. For example, to me, there's the talk about the search landscape; Bill Tancer of Hitwise is going to be there. So I would like to be a fly on the wall and ask them some questions about "do you use paths in your metrics?", "what do you do with AJAX?" and stuff like that.

Also on Monday, I think the lunch with the Sitemaps team is going to be pretty interesting. The Sitemaps team just rolled out a lot of new changes on Friday, and in fact Sitemaps has been renamed Google Webmaster Central, so it's now a general webmaster console. Major props to them for doing that; maybe I will talk about that a little bit more in the future.

And then on Monday, I have the focus group back at the Googleplex, so I have to leave and go home for that.

On Tuesday, I think, the ‘Auditing Paid Clicks’ session should be really interesting.

On Wednesday, I wouldn't miss the Q&A with Eric Schmidt. And I am biased because I am on the panel, but I think the "Search Engine Bloggers" one should be pretty interesting too. Nothing but Q&A, so you don't have to worry about PowerPoint or anything like that.

There are a lot of parties. It's always fun to do the parties. The one thing that I would definitely try to get to is our Google Dance; that's Tuesday night, and I'll go ahead and tell you a little secret that not everybody realizes: it's the fifth Google Dance. So we will have music, a DJ, lots of food, and all sorts of fun stuff.

The part that most people don't know, and we will try to get signs up, but I think this was a little too late to be on the main Search Engine Strategies program, is that we are going to have another "meet the Googlers" session during the party. It's mostly engineers, but we will also have a couple of product managers. People from all over the company, you know: quality, the ad side of things, webspam, people who have expertise in AdSense, click fraud, all sorts of stuff. And that will be going on during the Google Dance; in fact, during the middle part of the Google Dance.

So, if you are looking at the cafeteria, where there is probably loud music, it's sort of up on the second floor, all the way to the right. It's a room called "University"; it's like a little mini-theatre. And we'll probably have 10 or 12 Googlers, mostly engineers, answering questions.

So, if you want to take a break from the loud music and the dance scene and talk search for a while, please stop by and say hello. That's probably where I'll be. We'll hopefully have signs, but I'd love to see a lot of people coming by and asking us questions.

You know, those are the sessions that sounded really interesting to me, but if you are not a search engineer who's been doing it for a few years, you might find other sessions completely interesting; you know, search engine algorithm research, or if you are a marketer, you might want to go to completely different sessions.

The one tip that I would give is, I would probably say go ahead and sit in the back. Because, you know, if for whatever reason somebody starts going all "salesy" or something like that, you can just duck out. And the amazing thing about Search Engine Strategies is that you have four different tracks going on at one time. So if one track isn't interesting, or one particular speaker isn't your cup of tea, you just duck out and go look at another one. And if nobody is good at that moment, you just sit down and do some wifi or something like that.

Overall, have fun. The more people you talk to, the better. If you see my ugly mug around and I am not walking into a panel or getting ready for the next presentation, please come up, say hello, and introduce yourself. I am horrible with names and faces, so you may have to remind me: "Hey, I am IncrediBill, we met in Vegas," or something like that. But usually after a couple of times, I get it down. And I'd love for as many people as possible to come up and introduce yourselves.

So, if you are going to be at Search Engine Strategies San Jose, I hope you have a good time and I hope we'll see you there.

Transcription thanks to Peter T. Davis

Matt Cutts #11: Reinclusion requests

Here’s the eleventh in the series of videos posted by Google’s Matt Cutts to Google Video over the past year. These are important for every web developer to see. Please see Matt’s first!

See the rest of the videos!

Transcription

Hey! This is Matt and Emmy coming to you on Thursday after hockey at the Googleplex. Let's talk about, I don't know, reinclusion requests.

So, I did a blog post about reinclusion requests a while ago, but the procedure has changed a little bit since then. So, imagine you spammed, or someone that you hired as a webmaster spammed, and now you are no longer in Google. What do you do?

The best thing that I recommend is to register in Sitemaps, or the webmaster console, or Webmaster Central, whatever you want to call it. It's basically the place where you can get all kinds of information. Sometimes you can even find out if you have penalties on your site. We can't show all the penalties that we have, because that would clue in malicious spammers as well. But if there are real, legitimate sites that have valid content, we want them to be able to be found. So we can show penalties for some sites.

So, if you do have a penalty, or if you suspect that you might have a penalty, go ahead and register at Sitemaps and then fill out a reinclusion request. I think it's at the bottom left or something like that. And the more information you can give, the better.

So, for example, if you were using an SEO, or your web host got hacked, or whatever, give us as many specifics as you can. You also want to try to give some sort of timeline: here is what was going on, here is the mistake we made. The most important thing is, Google needs to know that it's not going to happen again.

So, find some way of letting us know, or convincing us, about whatever you think the problem was. Usually you'll have a pretty clear idea: something like hidden text, doorway pages, sneaky redirects using JavaScript, anything like that. We need to know that those pages, those violations of our quality guidelines, are not going to come back.

So that's the procedure I would go with. Try to include as much detail as possible about how it might have happened and what you are going to do to make sure it does not happen again. Then that goes into a queue, which we check, and we try to find out: OK, has the hidden text been removed, stuff like that. So reinclusion requests definitely get looked at by people, and that's the procedure I would recommend you use to put one in.

Transcription thanks to Peter T. Davis

Matt Cutts #10: Lightning Round!

Here’s the tenth in the series of videos posted by Google’s Matt Cutts to Google Video over the past year. These are important for every web developer to see. Please see Matt’s first!

See the rest of the videos!

Transcription

Alright. This is Matt Cutts, coming to you on Monday, July 31st. This is probably the last one I will do tonight, so let's try to do a lightning round.

Alright! Peter writes in and says:

"Is it possible to search for just home pages? I tried doing -inurl:html, -inurl:htm, blah, blah, blah... php, asp, but that doesn't filter out enough."

That's a really good suggestion, Peter. I hadn't thought about that.

Fast used to offer something like that, but I think all they did was look for a ~ in the URL. I will file that as a feature request and see if people are willing to prioritize it to where we might be able to offer that. My guess is it would be relatively low on the priority list, because the syntax you mentioned, subtracting off a bunch of extensions, would probably work pretty well.
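
As a rough client-side version of that extension-subtracting workaround, here is a sketch that filters a list of URLs down to homepage-looking ones; the URL list is made up:

```python
from urllib.parse import urlsplit

def is_homepage(url: str) -> bool:
    """Treat a URL as a homepage if its path is empty, '/', or a root
    index file; roughly what subtracting extensions approximates."""
    path = urlsplit(url).path.lower()
    return path in ("", "/", "/index.html", "/index.htm",
                    "/index.php", "/index.asp")

urls = [
    "http://www.example.com/",
    "http://www.example.com/blog/post.html",
    "http://example.org/index.php",
]
print([u for u in urls if is_homepage(u)])
# -> ['http://www.example.com/', 'http://example.org/index.php']
```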

Ah. I get to clarify something about strong versus bold, and emphasis versus italic. So, there was a previous question where somebody had asked whether it was better to use bold or whether it was better to use strong, because bold is what everybody used in the old days when the dinosaurs roamed the earth, and strong is what the W3C recommends. At that time, last night, I thought that we just barely, barely, barely, like by an epsilon, preferred bold over strong, and I said for the most part don't worry about it.

The nice thing is an engineer actually took me to the code, where I could see it for myself, and Google does treat bold and strong with exactly the same weight. So thank you for that, Paul. I really, really appreciate it. In addition, I checked the code, and it shows that 'em' and italic are treated exactly the same as well. So there you have it: go forth and mark up like the W3C would like you to, do it semantically well, and don't worry so much about crufty old tags, because Google will score it just the same either way.

Alright. In the lightning round, GoodmanAmanaHVAC asks:

“Will we see more kitty-posts in the future?”

I think we will. In fact, I tried to get my cats in on this show, but they are a li'l scared of the lights. Let's see if I can get them used to it.

TomHTML asks:

"What are Google SSD, Google GAS, Google RS2, Google Global Marketplace, Google Weaver, and other services discovered by Tony Rusco?"

I think it was very clever of Tony to try a dictionary attack against our services check-in, but I am not going to talk about what those services are.

What else have we got here?

Josef Humpkins asks:

"A preview of what many of the topics might be in the duplicate content session at SES."

I gave a little bit of a preview in one of the other sessions on video. But I think what we will basically talk about (Sherry will be there, a lot of people will be there) is shingling.

What I'll essentially say is, Google does a lot of duplicate detection, from the crawl all the way down to the very last millisecond, practically when the user sees things. We use stuff that's exact duplicate detection, and we do stuff that's near-duplicate detection. So we do a pretty good job all the way along the line of trying to weed out duplicates and stuff like that. And the best advice I can give is to make sure that your duplicate content, you know, pages which might have nearly the same content, look as different as possible, if they are truly different content.
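
Shingling, the technique mentioned above, is the classic approach to near-duplicate detection: slide a window of w consecutive words across each document and compare the resulting sets, typically with Jaccard similarity. A minimal sketch of the general idea in Python; this is the textbook method, not Google's implementation:

```python
def shingles(text: str, w: int = 4) -> set:
    """Return the set of w-word shingles (word n-grams) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two shingle sets: |intersection| / |union|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc1 = "the quick brown fox jumps over the lazy dog near the river"
doc2 = "the quick brown fox jumps over the lazy dog near the creek"
print(f"similarity: {jaccard(shingles(doc1), shingles(doc2)):.2f}")
# a score near 1.0 suggests the two documents are near-duplicates
```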

A lot of people worry about printable versions, or somebody else asked about a .doc or Word file compared to an HTML file. Typically you don't need to worry about that. If you have similar content on different domains, maybe one version in French and another version in English, you really don't need to worry about that either.

Again, if you do have the exact same content, maybe for a Canadian site and a .com site, it's probably just the sort of thing where we will detect whichever one looks better to us and just show that; it wouldn't necessarily trigger any sort of penalty or anything like that. If you want to avoid it, you can try to make sure that the templates are very, very different. But in general, if the content is quite similar, it's better just to let us show whichever representation we think is best anyway.

And Thomas writes in and says:

“Does Google index or rank blog sites differently, than regular websites?”

That's a good question.

Not really. Somebody else asked about links from .govs and .edus, and whether links from two-level-deep .govs and .edus, like .gov.pl, are the same as .gov. The fact is, we don't really have much in the way of saying, oh, this is a link from the ODP, or from .gov or .edu, so give it some sort of special boost. It's just that those sites tend to have higher PageRank, because more people link to them and reputable people link to them.

So for blog sites, there is not really any distinction, unless you go off to Blog Search of course, and then it's all constrained to blogs. In theory we could rank them differently, but for the most part, in just the general search, the way it crawls out, things are working out OK.

Alright! Thanks.

Transcription thanks to Peter T. Davis
