eGrove Systems Corporation - PrestaShop Development Services
Hire PrestaShop Developers from eGrove for all kinds of PrestaShop development services: migration, customization, integration, theme and module development.
What is SEO?
Search engine optimization is a technique that helps to improve the visibility of a site through its code and content.
Analytics and Web intelligence
Keyword and content
On-page SEO & site architecture – technical
Link development or off-page SEO – mostly non-technical
Technical SEO
Site architecture
Providing readable content to search engines
Avoiding duplicate content issues
Delivering content faster to users and search engines
Improving user experience
Site Architecture
The site needs to be designed for its intended target audience.
For example, if the site is for kids, the colors, theme and font sizes need to be chosen to attract kids and make them comfortable spending time on the site.
Business analysts, project managers and designers will mostly take care of this part.
How to provide content to search engines
Unique, SEO-friendly URLs
Meta data to provide details about the page
XML sitemaps for regular pages and for videos (a minimal sitemap sketch follows below)
Updating the sitemap whenever the site changes
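A minimal sketch of a regular XML sitemap, using a page URL from this presentation; the lastmod and changefreq values are illustrative assumptions:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.egrovesys.com/application-development/prestashop-development.html</loc>
    <lastmod>2012-08-21</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>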
URL is important for good ranking.
Best practices to follow for URL
structure:
Relevant, compelling and accurate.
Use hyphens to separate words and
don’t use underscores.
The URL should contain the primary keyword; adding secondary keywords is an additional value.
Limit the URL slug to 3-5 words.
For example, the following URL has 2 slugs:
http://www.egrovesys.com/application-development/prestashop-development.html
Whereas a URL with 7 slugs is too long; such slugs are typically generated from the H1 by default.
Programmers should not use such default functionality and need to use a solution for this.
If your SRS doesn't mention this, clarify with Xavier.
Try to limit URLs to 3 levels deep unless clients specify otherwise.
Title and Meta description are very important and
need to follow SEO guidelines
The title should not exceed 65 characters and the Meta description should not exceed 160 characters; otherwise the content will be truncated when search engines use it.
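A minimal sketch of a page head that stays within these limits; the wording is hypothetical:

<head>
  <title>PrestaShop Development Services | eGrove Systems</title>
  <meta name="description" content="Hire PrestaShop developers from eGrove for migration, customization, integration, theme and module development." />
</head>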
The Open graph protocol
OG tags help control how a link appears on Facebook when a page is shared or liked there, so they have value on Facebook but none in search engines.
◦ Since Facebook is growing, more and more clients are interested in having OG implemented on their sites. I recommend raising the question if it is not mentioned.
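A minimal sketch of Open Graph tags in the head section; the values, and in particular the image path, are hypothetical:

<meta property="og:title" content="PrestaShop Development Services" />
<meta property="og:type" content="website" />
<meta property="og:url" content="http://www.egrovesys.com/application-development/prestashop-development.html" />
<meta property="og:image" content="http://www.egrovesys.com/images/logo.png" />
<meta property="og:description" content="PrestaShop migration, customization and module development." />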
Avoiding the duplicate content, title and
description
Duplicate content is one of the serious issues that programmers need to avoid.
Some example sources:
http://www.example.com and http://example.com
www.example.com and www.example.com/index.html
www.example.com and www.example.com?session-id=1234
www.example.com/1 and www.example.com/1/
301 redirect
A 301 redirect helps when a page is no longer required and can be permanently redirected; the advantage of this practice is that there is no loss of link value.
For example, egrovesys.com and egrovesys.com/index.php render the same page.
See the screenshots on the next slide.
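On an Apache server, one way to handle the index.php case is a rewrite rule in .htaccess; a rough sketch, assuming the site root is the preferred URL:

RewriteEngine On
# Permanently redirect explicit requests for /index.php to the site root
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
RewriteRule ^index\.php$ http://www.egrovesys.com/ [R=301,L]

The RewriteCond on THE_REQUEST avoids a redirect loop when the server internally serves index.php as the directory index.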
Canonical element
The canonical element helps avoid duplicate issues arising from URLs generated with session IDs, query parameters and tracking codes.
Example: pages with tracking parameters generate duplicate title issues here:
http://www.thisoldhouse.com/toh/article/0,,1147475,00.html
http://www.thisoldhouse.com/toh/article/0,,1147475,00.html?xid=hinewsletter081908-47-skills
Implement a canonical link back to the preferred URL to resolve this issue.
The canonical element needs to be added within the head section:
<link rel="canonical" href="http://www.thisoldhouse.com/toh/article/0,,1147475,00.html" />
* The business analyst and programmer can add this feature as additional scope during project development.
Pagination handling
Pages in a series or galleries normally generate duplicate title and description issues, which can be avoided by using rel="next" and rel="prev".
Let us see the details in the following screenshots.
Pagination Screenshots Explanation
The first page only contains rel="next" and no rel="prev" markup.
Pages two to the second-to-last page should be doubly-linked with both rel="next" and rel="prev" markup.
The last page only contains markup for rel="prev", not rel="next".
rel="next" and rel="prev" values can be either relative or absolute URLs (as allowed by the <link> tag). And, if you include a <base> link in your document, relative paths will resolve according to the base URL.
rel="next" and rel="prev" only need to be declared within the <head> section, not within the document <body>.
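A sketch of the markup for a hypothetical three-page series at example.com/gallery?page=N; each snippet goes in the <head> of the corresponding page:

Page 1:  <link rel="next" href="http://www.example.com/gallery?page=2" />
Page 2:  <link rel="prev" href="http://www.example.com/gallery?page=1" />
         <link rel="next" href="http://www.example.com/gallery?page=3" />
Page 3:  <link rel="prev" href="http://www.example.com/gallery?page=2" />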
No index Meta and robots.txt
The "noindex" meta tag is useful if we don't want a page to be indexed. robots.txt can be used to block a particular section of a site from being crawled.
If the page is already indexed, robots.txt will not have any impact, so wherever possible use "noindex".
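Two short sketches, with hypothetical paths: a noindex meta tag placed in the head of the page to be excluded, and a robots.txt rule that blocks a section from crawling:

<meta name="robots" content="noindex, follow" />

# robots.txt at the site root
User-agent: *
Disallow: /admin/
Disallow: /search/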
Page load time
Page load time is one of the factors that can influence users to stay and transact on a site. Some of the areas where programmers can use their intelligence are:
Avoiding excessive CSS in the head:
Placing large blocks of CSS inside the head should be avoided so that spiders can reach the text quickly.
Example:
http://www.health.com/health/static/buzz/contests_and_giveaways.htm
An external CSS file is recommended to handle such issues, e.g.
<link rel="stylesheet" type="text/css" href="externalcss.css" />
See the screenshot on the next slide.
Avoiding Excessive JS in Head
Pages that contain excessive JavaScript need attention from the development team, to find whether it can be moved either to the bottom of the page or to an external file.
Google and other search engine spiders are more advanced nowadays and can detect page text even when there is excessive JavaScript; the time required to reach the text is still the important factor.
JavaScript is of no use to spiders, so excessive JavaScript in the head consumes spider time for no reason. Delivering the required text to spiders quickly, by eliminating lengthy JavaScript ahead of the body text, will improve ranking.
Example: http://www.health.com/health/anxiety
See the screenshot on the next slide.
Avoiding Excessive JS in Body
It is recommended to reduce the JS in the body to help spiders crawl the page quickly.
Page load time also improves if JavaScript is handled properly. In order to load a page, the browser must parse the contents of all <script> tags, which adds time to the page load. By minimizing the amount of JavaScript needed to render the page, and deferring parsing of unneeded JavaScript until it needs to be executed, we can reduce the initial load time of the page.
Example: http://www.health.com/health/appendicitis
See the screenshot on the next slide.
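A sketch of deferring a non-critical script, either with the defer attribute or by placing it just before the closing </body> tag; the script names are hypothetical:

<!-- parsed only after the document has been parsed, without blocking rendering -->
<script src="/js/analytics.js" defer></script>

<!-- or: moved to the end of the body instead of the head -->
  ...
  <script src="/js/widgets.js"></script>
</body>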
Avoiding Excessive Whitespace
Minifying code is recommended; this refers to eliminating unnecessary spaces, newline characters, comments, etc.
Example:
http://www.health.com/health/library/mdp/0,,d04537t1,00.html
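A small illustration of what minification removes, using a hypothetical CSS rule:

/* before minification */
.product-title {
    color: #333333;  /* dark grey */
    margin: 0 auto;
}

/* after minification */
.product-title{color:#333333;margin:0 auto}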
Following Heading Rules
H1 should come first in the source code and should be the first header tag parsed by any search engine crawler. Do not precede the H1 with any other header tag.
You should have only one (1) H1 tag per page. Thereafter, you can have as many H2 – H6 tags as necessary to lay out the page and its content, but use a logical sequence and do not "style" your text via header tags in your CMS.
H1
  H2
    H3
    H3
      H4
  H2
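The same hierarchy as an HTML sketch, with hypothetical heading text:

<h1>PrestaShop Development Services</h1>
  <h2>Theme Development</h2>
    <h3>Responsive Themes</h3>
    <h3>Custom Themes</h3>
      <h4>Design Guidelines</h4>
  <h2>Module Development</h2>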
Custom 404 error page:
HTTP requests are expensive, so making an HTTP request and getting a 404 "not found" error degrades the user experience.
Some sites have helpful and creative 404 error pages to soften the bad user experience, but such pages still waste server resources (database connections, etc.). Particularly bad is when the link to an external JavaScript file is wrong and the result is a 404.
It is good practice to keep 404 errors to a minimum through other means, such as blocking unnecessary URL generation. As a last resort, 301 redirects can be used, but such redirects should go to the main page or another related page.
Google maintains that 404 errors won't impact a site's search performance, and they can be ignored if we're certain that the URLs should not exist on our site. It's important to make sure that these and other invalid URLs return a proper 404 HTTP response code, and that they are not blocked by the site's robots.txt file.
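On Apache, a custom 404 page can be configured in .htaccess; a rough sketch with a hypothetical error page path (when given a local path, ErrorDocument still serves the page with a 404 status):

# .htaccess: serve a custom page for "not found" responses
ErrorDocument 404 /error-404.html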
Ajax implementation
An AJAX implementation on the site needs to follow Google's guidelines so that AJAX URLs can appear in the search results.
For example:
www.egrovesys.com/portfolio#1 should become:
www.egrovesys.com/portfolio#!1
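Under the Google AJAX crawling scheme referenced here, a rough sketch of how the hashbang URL is handled; the snapshot endpoint on the server side is an assumption:

User-visible URL:      www.egrovesys.com/portfolio#!1
Crawler requests:      www.egrovesys.com/portfolio?_escaped_fragment_=1
Server should return:  a full HTML snapshot of the content shown for #!1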
Thank You
Part II:
Combining images
Browser caching
Lossless compression of images
Inline JavaScript
Rich snippets for ratings and reviews
Moving a site to a new host
Ajax implementation