Lesson 14 - Unit N Optimizing Your Web Site for Search Engines
What should you learn from this lesson?
• What Search Engine Optimization (SEO) techniques can be used so that people searching for relevant terms are presented with a link to your Web page(s)?
• How do you write indexable content?
• How do you create a description with the meta element?
• What are the steps to incorporate microdata?
• What is a sitemap file and how do you create one?
• What is a robots.txt file and how do you create one?
• What are the steps in submitting your site to various search sites?
Understanding Search Engine Optimization (SEO)
Search engine optimization: the process of tailoring the structure and content of a Web page with search engines in mind.
Two main benefits:
• Increasing your site’s priority in search results
• Giving Web applications useful semantic information about your site
Algorithm:
• The set of instructions into which a search engine combines the factors it balances to decide the priority of search results
Microdata:
• Used to mark up Web page elements
• Covered in a W3C working draft for adding more types of semantic data to Web page content
Writing Indexable Content
• Making a Web site search engine friendly depends on its content
• Even small adjustments can improve the accuracy of indexing
• Title, heading, image, and linked text elements play important roles
• Table N-1 outlines guidelines for increasing the effectiveness of these elements for SEO
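As an illustration of these guidelines, the sketch below shows descriptive title, heading, image, and linked text; the wording and file names here are hypothetical examples, not taken from the lesson:

```html
<!-- Sketch only: descriptive text in these elements helps bots index the page.
     The wording and file names below are hypothetical. -->
<title>Murfreesboro Regional Soccer League - Team Schedules</title>
<h1>2012 Competitive Division Schedules</h1>
<!-- alt text describes the image content for bots (and screen readers) -->
<img src="league-logo.png" alt="Murfreesboro Regional Soccer League logo" />
<!-- linked text should describe the destination, not say "click here" -->
<a href="schedule.html">View the full season schedule</a>
```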
Adding a Description with the meta Element
• Code added to provide information specifically for search engines
• The meta element supplies a page summary and is easy to implement
• The charset attribute indicates the character encoding of the Web document
• A meta element with the name="description" attribute supplies a page summary in its content attribute
Given the following page, complete the code for the meta tags (charset and description), and add a title.
<meta charset="utf-8" />
<meta name="description" content="2012 season schedules for the Murfreesboro Recreational Soccer League competitive division" />
<title>Murfreesboro Regional Soccer League - Team Schedules</title>
Incorporating Microdata
Vocabularies:
• Define keyword values for specific types of information
• Serve as a common language for referencing data
• Anyone can define and use a custom vocabulary
• Reference a vocabulary in code using its URI
• Data-vocabulary.org contains popular, widely used vocabularies
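As a sketch of how a vocabulary is referenced, the markup below uses the Person vocabulary from data-vocabulary.org: the itemscope and itemtype attributes reference the vocabulary by its URI, and each itemprop names a keyword value it defines. The name and title values are hypothetical:

```html
<p itemscope itemtype="http://data-vocabulary.org/Person">
  <!-- itemprop values are keywords defined by the Person vocabulary -->
  <span itemprop="name">Jane Smith</span>,
  <span itemprop="title">League Coordinator</span>
</p>
```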
Good references on Microdata • http://www.intuit.com/website-building-software/blog/2012/04/3-advanced-seo-techniques-you-need-to-know/ • http://www.vanseodesign.com/web-design/html5-microdata/
The following is found in the footer of a Web page. Write the code for the microdata.
<footer>
  <p id="contact">
  </p>
</footer>
Creating a Sitemap File Sitemap:
• A file in a specific format that lists all the pages in a Web site.
• May include information about the content.
• The simplest version of a sitemap file contains only text.
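A minimal sketch of the text-only form: a plain text file with one fully qualified URL per line, placed in the site's root folder. The URLs below are hypothetical:

```
http://www.example.com/index.html
http://www.example.com/schedules.html
http://www.example.com/contact.html
```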
Creating a robots.txt File Bots:
• Also known as crawlers, these are the programs that search engines use to index Web pages.
The robots.txt file:
• Influences which pages bots index.
• Search engines look for a file named robots.txt.
• It should be located in the root folder of your Web site.
The "User-Agent" part specifies which search engines you are giving the directions to. Using the asterisk (*) means you are giving directions to ALL search engines.
• The "Disallow" part specifies what content you don't want the search engines to index. If you don't want to block the search engines from any area of your Web site, you simply leave this area blank.
• For most small Web sites, those two simple lines are all you really need.
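Those two lines, giving directions to all search engines and blocking nothing, would look like this:

```
User-Agent: *
Disallow:
```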
You can give some instructions about which content to avoid. A good example of this would be a site that has printer-friendly versions of all of its content housed in a folder called "print-ready." There's no reason for the search engines to index both forms of the content, so it's a good idea to block the engines from indexing the printer-friendly versions. In this case, you'd leave the "User-Agent" section alone, but would add the print-ready folder to the "Disallow" line. That robots.txt file would look like this:
User-Agent: *
Disallow: /print-ready/
A word of warning. While some sites will tell you to use robots.txt to block premium content you don't want people to see, this isn't a good idea. While most search engines will respect your robots.txt file and ignore the content you want to have blocked, a far safer option is to hide that premium content behind a login. Requiring a username and password to access the content you want hidden from the public will do a much more effective job of keeping both search engines and people out.
Previewing and Finalizing Your Site It is useful to give your documents a final check from a bot's point of view:
• Helps identify missing or hidden content
• Ensures that pages are ready for indexing
• You can install free programs that display only the text of a page
• You can approximate this view by changing browser settings
Submitting Your Site Webmasters:
• People in charge of Web sites.
• A Web site can simply be published to make it available to potential users and bots.
• Or you can submit the site directly to search engines.