SEO Question: Indexing Website Search Results | Neustar Domain Names



I have a search engine optimization question, and I couldn’t find an answer elsewhere. I know there are a number of pro SEOs who read my blog in their free time, and I am hoping someone can provide some feedback or advice for a site I am currently developing.

I am building a search-based directory right now. People can search for providers by city/state or zip code, which yields the results pages, some of which will hopefully be filled with my advertisers. Since the pages will only technically exist when people search for them, will they be indexed in Google? If I create a site map with text links to all US cities and states, will that be sufficient?

For example, the results page for Chicago would only be generated when someone searches for that city, yielding a search-result URL. Would that page be indexed in Google – and if not, how can I be sure to get it indexed in Google/Yahoo for those keywords?

Thanks if you can help!

About The Author: Elliot Silver is an Internet entrepreneur and publisher. Elliot is also the founder and President of Top Notch Domains, LLC, a company that has sold seven figures worth of domain names in the last five years. Please read the Terms of Use page for additional information about the publisher, website comment policy, disclosures, and conflicts of interest.

Reach out to Elliot: Twitter | Facebook | Email

Comments (14)

    George Pickering

First, you should mask the dynamic URL and make it look static.

    Then you can do a sitemap, but I would put the largest cities on the first page,

    then do a city page like this:
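Masking a dynamic search URL as a static-looking one is commonly done with Apache's mod_rewrite. A minimal .htaccess sketch, assuming a hypothetical `s` query parameter and a /city/ path scheme (both are placeholders, not the site's actual setup):

```apache
# Map a static-looking path like /city/chicago-il to the internal
# dynamic search handler. The "s" parameter name is an assumption.
RewriteEngine On
RewriteRule ^city/([a-z0-9-]+)/?$ /?s=$1 [L,QSA]
```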

    November 15th, 2009 at 8:27 pm

    George Pickering

    November 15th, 2009 at 8:28 pm

    dan gluckman

    Create a city navigation section using links to the search result pages for each city. ( /?s=Chicago%2C+IL ).

    Any page you want indexed should be linked to from pages that are already indexed.

    cheers. -d
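A city navigation block like the one Dan describes could be generated with a short script. A sketch, assuming the search takes an `s` query parameter as in his example URL, and using a hypothetical city list:

```python
from urllib.parse import urlencode

# Hypothetical city list; in practice this would come from the
# directory's own database of covered cities.
cities = ["Chicago, IL", "New York, NY", "Los Angeles, CA"]

def city_nav_links(cities):
    """Build an HTML list of links to each city's search-results page."""
    items = []
    for city in cities:
        href = "/?" + urlencode({"s": city})  # e.g. /?s=Chicago%2C+IL
        items.append('<li><a href="%s">%s</a></li>' % (href, city))
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

print(city_nav_links(cities))
```

Dropping the generated markup into the site's footer or sitemap page gives crawlers a static path to every result page.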

    November 15th, 2009 at 8:29 pm

On a brand new site that has just launched, it is a waste of time and effort…

1) these pages simply won’t rank


2) this entire setup is just way too spammy in the eyes of the engines and can actually be counter-productive

On more established sites that have been around for a while and are trusted and well liked by Google, for example, you can get almost any number of dynamic pages ranked easily. But you are trying to jump the gun, so in conclusion… forget about it šŸ™‚

    November 15th, 2009 at 10:45 pm


Google can’t find a page that “might exist” if someone types it in – it needs to be able to find the page on its own via a link or sitemap file.

The above solutions are good, and you could probably also have someone write a script to build links to these pages for the top 2,000 cities in America, etc., and link to them.
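A script along those lines might also emit an XML sitemap with one entry per city page. A minimal sketch, assuming hypothetical static-looking URLs of the form /city/<slug> (the domain and city list are placeholders):

```python
from urllib.parse import quote

# Hypothetical sample of the "top 2,000 cities" list mentioned above.
top_cities = ["Chicago", "New York", "Houston"]

def sitemap_xml(base_url, cities):
    """Emit a minimal XML sitemap with one <url> entry per city page."""
    entries = []
    for city in cities:
        slug = quote(city.lower().replace(" ", "-"))
        entries.append("  <url><loc>%s/city/%s</loc></url>" % (base_url, slug))
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

print(sitemap_xml("http://example.com", top_cities))
```

The output can be saved as sitemap.xml and submitted through the engines' webmaster tools.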

    November 15th, 2009 at 10:46 pm

    Mark Fulton

Hey Elliot, this is why the rel=”canonical” link tag was created. Here is more information:

    Best of luck with development.
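For concreteness, the canonical tag is a single line in each page's <head>. A hypothetical example (the URL is a placeholder) telling engines that variant search URLs all represent one preferred page:

```html
<!-- On every URL variant of the Chicago results page, point
     engines at a single preferred version (hypothetical URL). -->
<link rel="canonical" href="http://www.example.com/city/chicago-il" />
```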

    November 15th, 2009 at 11:19 pm


    This post on Matt Cutts’ blog might be of some use in predicting how Google may treat search results pages in the index:

    November 15th, 2009 at 11:59 pm

    Get Business Online

    I go along with the canonical URLs too (permalinks?).

I’ve heard that an established directory uses a shadow, SEO-optimized site to get indexed, using text-only pages, which are quick to crawl – possibly checking whether the visitor is a crawler or not, then serving the appropriate page.

    Also, the home page can serve your sitemap when it’s a crawler.

Having said that, strictly speaking, Google specifically frowns upon any “fudging” of pages shown only to its crawler, so maybe you can do this for a start and then go “straight”.

    Good point about duplicate pages – gotta make sure title, keywords and description vary enough from page to page.

    November 16th, 2009 at 12:19 am


It’s true that with too many pages appearing rapidly you will trigger a duplicate-content filter. You should build it up slowly and only put a page up once it’s completely finished, with its unique content, etc.

    November 16th, 2009 at 12:38 am

    net developers

You should rewrite your dynamic URLs into SEO-friendly URLs. First of all, make one common service or product page. Then, when you have a new entry that creates a new page or URL on the site, redirect that new URL to the one above. After some time your problem will surely be solved.

    November 16th, 2009 at 1:20 am


I have noticed that this site’s search results are indexed in Google and rank well. Not sure what their secret is, though.

While you’re on the subject of SEO, I noticed lately that the URLs of your pages include a number at the end. Is this for SEO purposes?

    November 16th, 2009 at 2:43 am


If you want to be found for a “certain service”, like “local plumber”, in a “certain city”, there is an easier way.

I run such a site for a different “service”. The trick is to get listed in the index first, then build 20–30 backlinks which promote identical services. After approx. 3–4 months, build a text rollover listing the main cities on your front “home” page.

This is how I solved the problem; it worked for me.

    November 16th, 2009 at 6:09 am

    Steve Morris

The way to do it is to create a dynamic page listing your cities, with links to the corresponding pages. Submit the page to bookmarking sites and place a link in your blog. Google can now index dynamic content quite well, but for SEO purposes I would look at URL rewriting.

Don’t have more than 100 links on a page; if you need to create more pages, do so. I have successfully used this method in the past to get directory-style sites indexed.
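Keeping each listing page under 100 links is a simple chunking step. A sketch of the idea, with a placeholder URL scheme standing in for the real city pages:

```python
def paginate(links, per_page=100):
    """Split a flat list of links into pages of at most per_page links."""
    return [links[i:i + per_page] for i in range(0, len(links), per_page)]

# Hypothetical example: 250 city links become three pages (100, 100, 50).
links = ["/city/%d" % n for n in range(250)]
pages = paginate(links)
print(len(pages), [len(p) for p in pages])
```

Each chunk then becomes one listing page, with the pages linked to each other so crawlers can walk the full set.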

    November 16th, 2009 at 8:54 am


I have quite a large directory site that has every page indexed and crawled regularly. Before I built it, I was tempted to use a CMS or keep the results in a PHP database. I chose to hand-create each page in bog-standard HTML so Google would have no problems accessing it all. It worked for me. I was up against tough competition from established big directory sites, but my little static site is now up there with them.

Is updating a nightmare? No, not really. I have cPanel, and in there I use the file manager. If I get a new listing, it takes me no longer to create the new listing or page than it would using a CMS – actually, I think I can do it quicker now.

Although it’s a static site, I have added widgets and other stuff which make it more dynamic. The beauty of it, though, is that I intended every page to get crawled no matter how many levels deep the file structure went – and it worked. Every new page I create is instantly gobbled up by Google with no problems at all.

    November 16th, 2009 at 10:59 am
