SEO Question: Indexing Website Search Results

I have a search engine optimization question, and I couldn’t find it elsewhere. I know there are a number of pro SEOs who read my blog during some free time, and I am hoping someone can provide some feedback/advice for a site I am currently developing.

I am building a search-based directory right now, and people can search for providers by city/state or zip code, which will yield the results pages, some of which will hopefully be filled with my advertisers. Since the pages will only technically exist when people search for them, will they be indexed in Google? If I create a sitemap with text links for all US cities and states, will that be sufficient?

For example, say I am developing LocalPlumber.com, and the results page for Chicago would be found when someone searches, yielding this URL: http://www.LocalPlumber.com/?s=Chicago%2C+IL. Would the search result be indexed in Google – and if not, how can I be sure to get it indexed in Google/Yahoo for the keywords?

Thanks if you can help!

14 COMMENTS

  1. Create a city navigation section using links to the search result pages for each city. ( /?s=Chicago%2C+IL ).

    Any page you want indexed should be linked to from pages that are already indexed.

    cheers. -d
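A minimal sketch of the city-navigation idea above, in Python. The city list is a hypothetical hard-coded stand-in (a real site would pull cities from its database), and `city_link` is a made-up helper name:

```python
from urllib.parse import urlencode

# Hypothetical city list -- a real site would pull these from its database.
cities = [("Chicago", "IL"), ("Houston", "TX"), ("Phoenix", "AZ")]

def city_link(city, state):
    """Build an anchor tag pointing at the search-result URL for one city."""
    query = urlencode({"s": f"{city}, {state}"})  # encodes to s=Chicago%2C+IL
    return f'<a href="/?{query}">{city}, {state}</a>'

# Join the anchors into a navigation block that crawlers can follow.
nav = "\n".join(city_link(c, s) for c, s in cities)
print(nav)
```

Because every results page now has a plain `<a href>` pointing at it from an indexed page, crawlers can discover the `?s=` URLs without anyone typing a search.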

  2. On a brand new site that has just launched, it is a waste of time and effort…

    1) these pages simply won’t rank

    2) this entire setup is just way too spammy in the eyes of the engines and can actually be counter-productive

    On more established sites that have been around for a while and are trusted and well liked by Google, for example, you can get almost any number of dynamic pages ranked easily. But you are trying to jump the gun, so in conclusion… forget about it 🙂

  3. Google can’t find a page that “might exist” if someone types it in – it needs to be able to find the page on its own via a link or sitemap file.

    The above solutions are good, and you could probably also have someone write a script to build links to these pages for the top 2,000 cities in America, etc.
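The “script to build links” idea could also emit an XML sitemap file. A rough Python sketch, assuming a short stand-in city list (the real one would hold around 2,000 entries) and a hypothetical `sitemap_xml` helper:

```python
from urllib.parse import quote_plus

# Stand-in for the top-2000 US cities list mentioned above.
cities = [("Chicago", "IL"), ("New York", "NY"), ("Houston", "TX")]

def sitemap_xml(base="http://www.LocalPlumber.com"):
    """Emit one <url> entry per city search-result page."""
    entries = "\n".join(
        f"  <url><loc>{base}/?s={quote_plus(city + ', ' + state)}</loc></url>"
        for city, state in cities
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "\n</urlset>")

print(sitemap_xml())
```

The resulting file would be saved as `/sitemap.xml` and submitted to the engines, giving them a crawlable list of every city page even though those pages are generated on the fly.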

  4. I go along with the canonical URLs too (permalinks?).

    I’ve heard from an established directory they use a shadow SEO-optimized site to get indexed, using text-only pages, which are quick. Possibly checking if it’s a crawler or not, then serving the page.

    Also, the home page can serve your sitemap when it’s a crawler.

    Having said that, strictly speaking, Google specifically frowns upon any “fudging” of pages for its view only, so maybe you can do this for a start and then go “straight”.

    Good point about duplicate pages – gotta make sure title, keywords and description vary enough from page to page.

  5. It’s true that with too many pages appearing rapidly you will trigger a duplicate content filter. You should build it up slowly and only put a page up once it’s completely finished, with its unique content etc.

  6. I have noticed that this site jseeker.com.au has its search results indexed in Google, and they rank well. Not sure what their secret is, though.

    While you’re on the subject of SEO, I noticed lately that the URLs of your pages include a number at the end. Is this for SEO purposes?

  7. If you want to be found for a certain service like “local plumber” in a certain city, there is an easier way.

    I run such a site for a different service. The trick is to get listed in the index first, then build 20–30 backlinks which promote identical services. After approx. 3–4 months, build a text rollover listing the main cities on your front “Home” page.

    This is how I solved the problem; it worked for me.

  8. The way to do it is create a dynamic page listing your cities, with a link to the corresponding results page for each. Submit that page to bookmarking sites and place a link to it in your blog. Google can now index dynamic content quite well, but for SEO purposes I would look at URL rewriting.

    Note:
    Don’t have more than 100 links on a page; if you need to create more pages, do so. I have successfully used this method in the past to get directory-style sites indexed.
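The URL-rewriting and 100-links-per-page advice can be sketched together. A hedged Python example with made-up helper names (`slug`, `paginate`) and a dummy city list standing in for the full directory:

```python
import re

# Dummy list standing in for every city in the directory.
cities = [f"City{i}, IL" for i in range(250)]

def slug(name):
    """Rewrite 'Chicago, IL' into a crawler-friendly path like /chicago-il/."""
    return "/" + re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-") + "/"

def paginate(links, per_page=100):
    """Split the link list into pages so no single page exceeds 100 links."""
    return [links[i:i + per_page] for i in range(0, len(links), per_page)]

pages = paginate([slug(c) for c in cities])
print(len(pages), pages[0][0])  # -> 3 /city0-il/
```

A rewrite rule on the server would then map each pretty path like `/chicago-il/` back to the underlying `?s=` search query, so crawlers see static-looking URLs.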

  9. I have quite a large directory site that has every page indexed and crawled regularly. Before I built it I was tempted to use a CMS or keep the results in a PHP database. I chose to hand-create each page in bog-standard HTML so Google would have no problems accessing it all. It worked for me. I was up against tough competition from established big directory sites, but my little static site is now up there with them.

    Is updating a nightmare? No, not really. I have cPanel and I use the file feature in there. If I get a new listing, it takes me no longer to create the new listing or page than it would using a CMS; actually, I think I can do it quicker now.

    Although it’s a static site, I have added widgets and other stuff which make it more dynamic. The beauty of it, though, is that I intended every page to get crawled no matter how many levels deep the file structure went, and it worked. Every new page I create is instantly gobbled up by Google with no problems at all.
