
Never let a Search Engine make a decision

When you build a website you need to completely understand how a search engine looks at it. Never let a search engine make any decisions for you. When a search engine comes to your website, you should already know exactly how it is going to crawl your site.

Pages should only have one URL. If you have a database-driven website, it is very common for the same content to be reachable at an unlimited number of URLs. You have to enforce your URLs: each piece of content needs to be tied to exactly one URL. Put logic into your site so that if it detects content being pulled up at the wrong URL, it 301 redirects to the correct one. All URLs should be consistent, and the best way to do that is to make every URL lowercase. If you use a Windows server you need to be extra careful, because IIS is case-insensitive and will serve the same content at both /Page.aspx and /page.aspx, producing duplicate content.
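Here is a minimal sketch of that enforcement logic, assuming a Python/Flask site (the post names no platform, so the framework and the example route are purely illustrative):

```python
# A minimal sketch, assuming Flask: canonicalize every incoming URL to
# lowercase and 301 redirect whenever the requested form differs.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def enforce_canonical_url():
    path = request.path
    # Canonical form: lowercase, no trailing slash (keep a bare '/').
    canonical = path.lower().rstrip('/') or '/'
    if path != canonical:
        query = '?' + request.query_string.decode() if request.query_string else ''
        return redirect(canonical + query, code=301)

@app.route('/widgets/<name>')
def widget(name):
    return f'Widget page for {name}'
```

The same pattern works in any stack: compute the single canonical form of the requested URL and 301 everything else to it, so search engines only ever index one address per page.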

{ 5 comments }

I typed HYIP into Google and saw a bunch of AdWords advertisers, and the first organic result was a definition from Wikipedia (of course):

“A high-yield investment program (HYIP) is a type of Ponzi scheme, which is an investment scam that promises an unsustainably high return.” The ads were obvious scams and the first organic result was a big warning. The people bidding on “hyip” are fishing for really stupid people. It just amazes me that people are that stupid.

Shouldn’t Google AdWords watch out for scams?

high-yield investment program (HYIP) Search on Google

{ 0 comments }

Google may be switching to AJAX-based pages

The Internet marketing industry is buzzing about the possibility of Google switching to AJAX-based pages. This would break many programs that scrape Google. Many people scrape Google for many reasons, some harmless and some very malicious. Every time a visitor queries Google it costs them a very small amount of money. This does not seem like a big deal until you consider how many people are doing it and how often.
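To see why, consider what a typical scraper does: it downloads the static HTML and parses it. The sketch below (pointed at example.com rather than Google, since actually scraping Google violates its terms of service) shows that anything injected client-side by AJAX is simply absent from what such a program parses:

```python
# A hedged illustration, not working scraper code: an HTML-parsing scraper
# only ever sees links present in the raw page source. Results loaded
# afterward by JavaScript/AJAX never appear in that source.
import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href attributes from anchor tags in static HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href':
                    self.links.append(value)

html = urllib.request.urlopen('https://example.com/').read().decode('utf-8', 'replace')
collector = LinkCollector()
collector.feed(html)
print(collector.links)  # Only static links; AJAX-loaded content is invisible here.
```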

This will probably only stop amateurs. Scraping Google is so profitable that the pros will figure out a way, and fast, and some may already use tactics that are not affected by it. I’m sure that scraping Google has gotten so out of hand that this switch would save Google a lot of money.

{ 0 comments }

Google can now run JavaScript

It appears that, as a first step toward making Flash content spiderable, Google has added the ability for Googlebot to execute JavaScript. I’m not sure how robust it is, but I do know that it can. I’m working with a client that had a very large amount of content that was previously not in Google because the content required JavaScript to be seen. Out of nowhere this site has over 600,000 pages of content in Google that were previously not there.

I was very confused at first. I knew that those pages were not there before and that Google should not be able to see them. I put one of the URLs into a spider simulator and the text showed up. I loaded it in Firefox and IE with JavaScript turned off and the content would not show up. I loaded it in Lynx and the content did not show up either. I wonder why, and how, the top spider simulators are running JavaScript.
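The test I ran boils down to something like this sketch (the URL and marker phrase are hypothetical stand-ins for the client’s pages): fetch the raw source the way a non-JavaScript crawler would and check whether the JavaScript-rendered text is actually in it:

```python
# A minimal sketch of the test described above; URL and MARKER are hypothetical.
import urllib.request

URL = 'https://example.com/some-client-page'
MARKER = 'text that only appears after JavaScript runs'

raw_html = urllib.request.urlopen(URL).read().decode('utf-8', 'replace')

if MARKER in raw_html:
    print('Content is in the static HTML; any crawler can see it.')
else:
    print('Content is missing from the raw source; only a crawler that '
          'executes JavaScript could be seeing it.')
```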

[tags]firefox, flash, googlebot, internet explorer, javascript, lynx[/tags]

{ 33 comments }

Google showing full address on AdWords PPC ads

I was doing a search today on a product name and I noticed that one of the AdWords results had an address right below it. It does not do this every time; sometimes it just says Houston, TX.

Google AdWords Address

[tags]google adwords, google, adwords, address in link[/tags]

{ 10 comments }

Google OneBox search showing 10 results

When I did a search on Google for “houston accounting software”, I saw 10 results in the local OneBox compared to the one to three it normally shows.

[tags]google local, one box, local search[/tags]

{ 8 comments }

Social Media Spam is not for Links



I was reading an article on the front page of Digg today that talked about how people spam social media sites to get more links. This is not true. The real reason is that sites like www.digg.com are search engines themselves. I discovered this when an article I had on Digg did not get very many diggs but, for some reason, still gets traffic from Digg to this day. What the spammers do is submit tons and tons of pages just so they show up in searches for all kinds of long-tail search terms. It also helps to keep doing it so that they show up in upcoming searches. I wish Google was that easy to game. A lot of people don’t realize that Digg is an awesome search engine. I use it all the time to find stuff.

[tags]digg, social media, spam, spamming, seo, google, search engine[/tags]

{ 7 comments }

I was trying to submit this hunting video to Digg today and was told:

“Your URL appears to be redirecting a bit too much for our tastes.”

This is just a normal ASP.NET web page that uses Web Parts. ASP.NET does not work like most web programming; the framework does a lot of things for you, and it really likes to use 302 redirects to serve pages.

Digg has decided that a site using ASP.NET Web Parts is spam, I guess. This site has worked hard to build a video and picture sharing community for people into outdoor sports like hunting and fishing, and now its members can’t share things they upload to Digg. I think Digg was trying to stop spammers, but what it actually did was block legitimate websites.
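My guess is that Digg’s submission checker simply follows the redirect chain and rejects URLs with too many hops. A rough sketch of that kind of check (the hop limit is a guess, and the requests library is assumed):

```python
# A rough sketch of a redirect-chain check; the max_hops threshold is a guess.
import requests

def too_many_redirects(url, max_hops=2):
    """Follow redirects and report whether the chain exceeds max_hops."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(f'{hop.status_code} -> {hop.headers.get("Location")}')
    return len(response.history) > max_hops

print(too_many_redirects('https://example.com/'))
```

A stock ASP.NET Web Parts page that bounces through a couple of 302s before rendering would trip a check like this even though nothing spammy is going on.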

[tags]digg, digg.com, asp, microsoft, asp.net, asp.net 2.0, digg submission[/tags]

{ 16 comments }

Google Analytics Connection Speed stat

I have been working with Google Analytics a lot lately and I like it a lot. Before I used it, I had asked some friends what percentage of their site visitors were on dial-up. They gave me really low numbers like 3-5%, which I thought was really low, especially in the niche they were in. Now that I have my own stats I see the same numbers. Right now I have 5.32% dial-up users and 63.5% broadband; the rest goes to unknown. When you have 27% unknown, those numbers are almost worthless. I would imagine that all broadband providers are known, so there is a very good chance the unknowns are dial-up. T1 users could really be dial-up too, since anybody can start an ISP with a T1 and some modems. These numbers are pretty much worthless.
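A quick bound using the figures above shows just how wide the uncertainty really is:

```python
# Back-of-the-envelope bounds using the percentages quoted in the post:
# the true dial-up share lies between "no unknowns are dial-up" and
# "every unknown is dial-up".
dialup_known = 5.32   # % identified as dial-up
unknown = 27.0        # % Analytics could not classify

low = dialup_known              # if none of the unknowns are dial-up
high = dialup_known + unknown   # if all of the unknowns are dial-up

print(f'Dial-up share: somewhere between {low:.2f}% and {high:.2f}%')
```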
[tags]google, google analytics, analytics, dial up, isp, internet, dsl, broadband[/tags]

{ 12 comments }

Why is HTML being left on the side of the road?

I am noticing that a lot of web developers consider HTML old school and something to avoid. This is a very bad trend. People need to realize that web pages are documents that need to be cataloged and searched. If you don’t use a universally accepted standard to mark them up, companies like Google and Yahoo will have a harder time figuring out what a page is about.

HTML is a standard for defining the elements in a text document. An HTML-only document looks very plain and ugly in a web browser, and using HTML to design your web page is very old school and does not look very good. At the dawn of the Internet, HTML played two roles: markup and design. We now have CSS, which takes over the job of defining how something looks. People are so caught up in how a web page looks that they forget about the markup side of a web page.
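To make the markup-versus-design point concrete, here is a small sketch (the page snippets are invented) of what a simple indexer can recover from semantic HTML versus purely presentational markup:

```python
# The same text marked up semantically versus as presentational "div soup".
# A crude indexer that looks for heading tags recovers structure from the
# first document and nothing from the second.
from html.parser import HTMLParser

semantic = '<h1>Widget Pricing</h1><p>Our widgets cost $5.</p>'
div_soup = '<div class="big-bold-text">Widget Pricing</div><div>Our widgets cost $5.</div>'

class OutlineExtractor(HTMLParser):
    """Records text found inside heading tags, the way an indexer might."""
    def __init__(self):
        super().__init__()
        self.in_heading = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in ('h1', 'h2', 'h3', 'h4', 'h5', 'h6'):
            self.in_heading = True

    def handle_endtag(self, tag):
        if tag in ('h1', 'h2', 'h3', 'h4', 'h5', 'h6'):
            self.in_heading = False

    def handle_data(self, data):
        if self.in_heading:
            self.headings.append(data)

for label, doc in (('semantic', semantic), ('div soup', div_soup)):
    extractor = OutlineExtractor()
    extractor.feed(doc)
    print(f'{label}: headings = {extractor.headings}')
# semantic: headings = ['Widget Pricing']
# div soup: headings = []  <- the heading is invisible as structure
```

CSS can make the div version look identical in a browser, but only the semantic version tells a search engine which text is the heading.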

Any good SEO should know this. Part of the job of an SEO is to educate developers about this topic. We are document markup experts. Companies need to have an SEO on staff to make sure the developers don’t lose sight of this.

[tags]seo, html, markup language, css, google, yahoo[/tags]

{ 10 comments }