I am not an SEO expert but here are my suggestions.
There are a number of ways to improve how your site appears in Google (or just about any other public search engine) results.
The most basic tool at your disposal is robots.txt [1], a plain text file that every search engine looks for in the web root. In this file you can specify which crawlers are allowed and which files and folders should or should not be indexed. In addition, you can point search engines at the exact URLs you want indexed by supplying a sitemap file in XML format [2], where each URL can also be given a priority and change frequency.
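As a rough illustration, a minimal robots.txt and sitemap pair could look like this (example.com, the /admin/ folder and the listed URL are made-up placeholders, not taken from the question):

    # robots.txt - served from the web root, e.g. https://example.com/robots.txt
    User-agent: *          # rules apply to all crawlers
    Disallow: /admin/      # do not crawl anything under /admin/
    Sitemap: https://example.com/sitemap.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap.xml - lists the URLs you want indexed -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2024-01-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>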
Secondly, you can configure Apache (or whatever web server the site is using) to send specific HTTP headers that tell both search engines and client browsers how long content may be cached and when it expires [3]; once the content expires, crawlers will re-fetch and re-index the pages.
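For example, with Apache's mod_expires and mod_headers modules enabled, something along these lines could go in the server config or an .htaccess file (a sketch only; the one-week lifetime is an arbitrary choice, tune it to how often your content actually changes):

    <IfModule mod_expires.c>
        ExpiresActive On
        # HTML pages may be cached for one week before clients and crawlers re-fetch them
        ExpiresByType text/html "access plus 1 week"
    </IfModule>
    <IfModule mod_headers.c>
        # Explicit Cache-Control header with the same one-week lifetime (in seconds)
        Header set Cache-Control "max-age=604800, public"
    </IfModule>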
Thirdly, you can use certain HTML meta tags on a per-page basis to define a correct title, description and so on for each page [4]. This alone can improve results significantly.
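For instance, the head of each page could contain something like the following (the title, description and URL are placeholders):

    <head>
      <title>Blue Widgets - Example Shop</title>
      <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
      <!-- optional: per-page crawler hints and the preferred (canonical) URL -->
      <meta name="robots" content="index, follow">
      <link rel="canonical" href="https://example.com/widgets/blue">
    </head>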