
Our approach to web design goes beyond creating great-looking websites. We design powerful, effective, and functional user experiences that let you engage directly with your customers. We build web pages that load quickly and are easy for search engines to locate and index. We also implement and customize content management systems so that non-technical content providers can maintain and update website content easily.

To get the best results, we use a custom design brief that helps us accomplish two important goals: (1) identifying all elements of our clients’ marketing and visual needs, and (2) fully understanding the goals and objectives they are trying to achieve.

If you’d like to take a look at our design brief, email or call us and we’ll get back to you right away.

CDS Web Services has partnered with Joomla to let our clients harness the full potential of selling online. We established this relationship because Joomla’s solutions are more than a tool for catalog management and payments: the commerce platform incorporates the full scope of Joomla’s best-in-class web content management (WCM) platform to power your e-shop from end to end and fuel your business growth.

Our strategy, design, development and technology solutions are essential to business; they are second to none in terms of their effectiveness and the value they deliver.

We have a passion for simplifying complex, cumbersome business processes and revitalizing or streamlining marketing, business development and client interface activities through the smart application of interactive technologies.

At CDS Web Services we are experienced in optimizing websites to maximize your chances of being “seen” by search engines. While we cannot guarantee a specific ranking position for a given keyword search, we can and will fine-tune your website so that it has the best possible opportunity to perform well in search results.

Below are the guidelines Google offers to help optimize your site for the Google search engine.


Google’s Webmaster Guidelines

Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the “Quality Guidelines,” which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise penalized. If a site has been penalized, it may no longer show up in results on Google.com or on any of Google’s partner sites.


When your site is ready:

Have other relevant sites link to yours.

Submit it to Google.

Submit a Sitemap through Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase coverage of your webpages.
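A Sitemap is a plain XML file listing the URLs you want crawled. A minimal example (the domain, dates, and frequencies below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2013-12-02</lastmod>
  </url>
</urlset>
```

Save the file as sitemap.xml at the root of your site, then submit its URL in Webmaster Tools.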

Make sure all the sites that should know about your pages are aware your site is online.

Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.


Design and Content Guidelines


Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.

Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.

Make sure that your TITLE and ALT tags are descriptive and accurate.
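For example, a descriptive page title and image alt text might look like this (the church name and file names are placeholders):

```html
<head>
  <!-- Shown as the page's headline in search results -->
  <title>First Community Church – Service Times, Directions, Ministries</title>
</head>

<body>
  <!-- alt text describes the image for crawlers and screen readers -->
  <img src="images/choir.jpg" alt="The First Community Church choir at the Easter service">
</body>
```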

Check for broken links and correct HTML.

If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

Keep the links on a given page to a reasonable number (fewer than 100).


Technical Guidelines


Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
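If Lynx isn't handy, you can get a rough preview of the text a crawler sees by stripping the markup yourself. A small Python sketch using the standard html.parser module (an approximation only; it drops scripts and styles much as a text browser would):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text of a page, ignoring <script> and <style>."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0  # depth inside <script>/<style> elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.parts.append(data.strip())

# A toy page: the script body should not appear in the extracted text.
html = """<html><head><title>Welcome</title>
<script>var x = 1;</script></head>
<body><h1>Our Church</h1><p>Service times and directions.</p></body></html>"""

parser = TextExtractor()
parser.feed(html)
print(" | ".join(parser.parts))  # → Welcome | Our Church | Service times and directions.
```

If important content on your real pages does not show up in output like this, a crawler probably cannot see it either.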

Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
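On the server side the logic is just a date comparison: if the resource has not changed since the date in the header, answer 304 Not Modified and skip the body. A minimal sketch in Python (the function name and return shape are illustrative, not a specific server API):

```python
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone

def respond(last_modified, if_modified_since):
    """Decide a conditional GET: return (status_code, send_body).

    last_modified: when the resource last changed (aware UTC datetime).
    if_modified_since: the raw header value sent by the crawler, or None.
    """
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            since = None  # unparseable header: fall through to a full response
        # HTTP dates have one-second resolution, so compare whole seconds.
        if since and last_modified.replace(microsecond=0) <= since:
            return 304, False   # crawler's copy is still fresh
    return 200, True            # send the full page

changed = datetime(2014, 3, 1, 12, 0, tzinfo=timezone.utc)
header = format_datetime(datetime(2014, 4, 1, tzinfo=timezone.utc), usegmt=True)
print(respond(changed, header))  # → (304, False): nothing changed since the crawl
```

Most web servers (Apache, IIS) already do this for static files; the sketch matters mainly for dynamically generated pages.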

Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it’s current for your site so that you don’t accidentally block the Googlebot crawler. You can test your robots.txt file to make sure you’re using it correctly with the robots.txt analysis tool available in Google webmaster tools.
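The file itself is plain text. Python's standard urllib.robotparser reads the same syntax the crawlers do, so you can sanity-check your rules before publishing them (the paths below are examples):

```python
from urllib.robotparser import RobotFileParser

# Example rules: block every crawler from the admin area and from
# auto-generated search-results pages, allow everything else.
robots_txt = """\
User-agent: *
Disallow: /administrator/
Disallow: /search/
Sitemap: https://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "/sermons/easter.html"))       # → True
print(rp.can_fetch("Googlebot", "/administrator/index.php"))   # → False
```

A quick check like this helps catch an overbroad Disallow line before it accidentally blocks Googlebot from your whole site.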

If your company buys a content management system, make sure that the system can export your content so that search engine spiders can crawl your site.

Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.


Quality Guidelines


These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.


If you believe that another site is abusing Google’s quality guidelines, please report that site through Google Webmaster Tools at https://www.google.com/webmasters/tools/spamreport?hl=en. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts.


Quality Guidelines – Basic Principles


Make pages for users, not for search engines. Don’t deceive your users or present different content to search engines than you display to users, which is commonly referred to as “cloaking.”

Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”

Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.

Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.


Quality Guidelines – Specific Guidelines


Avoid hidden text or hidden links.

Don’t employ cloaking or sneaky redirects.

Don’t send automated queries to Google.

Don’t load pages with irrelevant words.

Don’t create multiple pages, subdomains, or domains with substantially duplicate content.

Don’t create pages that install viruses, trojans, or other badware.

Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.

If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

If a site doesn’t meet our quality guidelines, it may be blocked from the index. If you determine that your site doesn’t meet these guidelines, you can modify your site so that it does and request reinclusion.


Testimonials

Marilyn Brown, www.offthebonebbq.com (2013-10-06)
“Makes Us Shine”: It has been a pleasure to work with Steve as our webmaster. He has been with us for almost the whole seven years of our business. He …

Brenda Miller, www.themillerhouse.net (2014-04-19)
“Talent and Integrity”: Although I have never met Steve in person, I feel like we have been friends for years. Every time I have a computer question or …

Jeanne Wainer, www.parklandpros.org (2014-03-19)
“Responsive, Diligent and Professional”: We have used the services of Steve Gergens as the webmaster for Parkland PROs for the last two years. We are …