Friday, October 17, 2008

GOOGLE CHROME



Google Chrome is a new web browser developed by Google. A beta version for Microsoft Windows (XP and later only) was released on 2 September 2008 in 43 languages. Mac OS X and Linux versions are under development.
The first release of Google Chrome passed the Acid1 and Acid2 tests. While it has not yet passed the Acid3 test, Google Chrome scored 78/100—higher than both Internet Explorer 7 (14/100) and Firefox 3 (71/100), but lower than Opera’s 84/100. When compared to equivalent “preview” or beta builds, Chrome scored lower than Firefox (85/100), Opera (91/100), and Safari (100/100), but still higher than Internet Explorer (21/100).
Chromium releases:
On 15 September 2008, CodeWeavers released an unofficial bundle of a WINE derivative and Chromium Developer Build 21 for Linux and Mac OS X, which they dubbed “CrossOver Chromium”. An unofficial workaround for use with Windows 2000 is also available.
Design:
Primary design goals were improvements in security, speed, and stability compared to existing browsers. There also were extensive changes in the user interface.[8] Chrome was assembled from 26 different code libraries from Google and others from third parties such as Netscape.
Security:
Chrome periodically downloads updates of two blacklists (one for phishing and one for malware) and warns users when they attempt to visit a harmful site. This service is also made available for use by others via a free public API called the “Google Safe Browsing API”. In the process of maintaining these blacklists, Google also notifies the owners of listed sites, who may not be aware of the harmful software hosted there.
Chrome will typically allocate each tab its own process to “prevent malware from installing itself” or “using what happens in one tab to affect what happens in another”; however, the actual process-allocation model is more complex. Following the principle of least privilege, each process is stripped of its rights: it can compute, but it cannot write files or read from sensitive areas (e.g. documents, desktop). This is similar to the “Protected Mode” used by Internet Explorer 7 on Windows Vista. The Sandbox Team is said to have “taken this existing process boundary and made it into a jail”;[26] for example, malicious software running in one tab is unable to sniff credit card numbers, interact with the mouse, or tell “Windows to run an executable on start-up”, and it is terminated when the tab is closed. This enforces a simple computer security model with two levels of multilevel security (user and sandbox), in which the sandbox can only respond to communication requests initiated by the user.
Plugins such as Adobe Flash Player are typically not standardized and, as such, cannot be sandboxed the way tabs can be. They often need to run at, or above, the security level of the browser itself. To reduce exposure to attack, plugins run in separate processes that communicate with the renderer, which itself operates at “very low privileges” in dedicated per-tab processes. Plugins will need to be modified to operate within this software architecture while following the principle of least privilege.
Chrome supports the Netscape Plugin Application Programming Interface (NPAPI), but does not support the embedding of ActiveX controls. Chrome also lacks an extension system such as Mozilla’s XPInstall architecture. Java applet support is available in Chrome as part of the pending Java 6 Update 10, which is currently in release-candidate testing.
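As a purely conceptual sketch of the process-per-tab idea (TypeScript on Node.js; an illustration of the principle, not Chrome’s actual code or security machinery), the hypothetical broker below spawns one operating-system process per “tab”. The child can compute and reply only over the channel the broker opened, and closing a tab simply kills the whole process:

```typescript
// Conceptual sketch only (not Chrome's code): one OS process per "tab",
// talking to a parent "broker" over an IPC channel the broker created.
import { fork } from "node:child_process";

if (process.send) {
  // ---- child ("renderer") side: runs in its own process ----
  const url = process.argv[2];
  // The child can compute, but it has no access to the broker's memory;
  // it can only answer over the channel the broker opened.
  process.send({ url, status: "rendered" });
} else {
  // ---- parent ("broker") side ----
  const openTab = (url: string) => {
    const renderer = fork(process.argv[1], [url]); // re-run this script as the child
    renderer.on("message", (msg) => console.log("from tab:", msg));
    // Closing the tab terminates the entire child process, discarding
    // anything (malicious or not) that was running inside it.
    return { close: () => renderer.kill() };
  };

  const tab = openTab("https://example.com");
  setTimeout(() => tab.close(), 1000);
}
```

Chrome’s real sandbox additionally strips each renderer process of OS-level rights (no file writes, no access to documents or the desktop); the sketch only captures the process boundary and the broker-initiated communication.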
A private browsing feature called Incognito mode is provided as well. It prevents the browser from storing any history information or cookies from the websites visited. This is similar to the private browsing feature available in Apple’s Safari and the latest beta version of Internet Explorer 8.
The current beta uses an old version of WebKit (525.13), which is the same WebKit engine used by the old Safari v3.1. The current Safari version is v3.1.2, which fixed several critical issues, including the “blended threat” carpet-bombing vulnerability. Google even mentions in its own documentation that it uses the Safari v3.1 rendering engine.
Speed:
The JavaScript virtual machine was considered a sufficiently important project to be split off (as was Adobe/Mozilla’s Tamarin) and handled by a separate team in Denmark. Existing implementations were designed “for small programs, where the performance and interactivity of the system weren’t that important,” but web applications such as Gmail “are using the web browser to the fullest when it comes to DOM manipulations and JavaScript.” The resulting V8 JavaScript engine has features such as hidden class transitions, dynamic code generation, and precise garbage collection.[8] Tests by Google show that V8 is about twice as fast as Firefox 3 and the Safari 4 beta.
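To give a feel for why hidden class transitions matter, here is a small conceptual TypeScript sketch (an illustration of the idea, not V8 internals): engines like V8 can share one internal hidden class between objects whose properties are created in the same order, which lets them generate fast, specialized code for property access:

```typescript
// Objects built with the same property order can share one hidden class,
// so the engine can specialize property access for that layout.
class Point {
  constructor(public x: number, public y: number) {}
}
// Every Point has the shape {x, y}, created in the same order.
const a = new Point(1, 2);
const b = new Point(3, 4);

// Adding properties in varying orders forces the engine to track several
// layouts for what is logically the same kind of object, defeating the
// optimization.
function looselyShapedPoint(flip: boolean): Record<string, number> {
  const p: Record<string, number> = {};
  if (flip) { p.y = 1; p.x = 2; } else { p.x = 2; p.y = 1; }
  return p;
}

console.log(a.x + b.y, looselyShapedPoint(true), looselyShapedPoint(false));
```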
Several websites have performed benchmark tests using the SunSpider JavaScript Benchmark[1] tool, as well as Google’s own set of computationally intense benchmarks, which include ray tracing and constraint solving. They unanimously report that Chrome performs much faster than all competitors against which it has been tested, including Safari, Firefox 3, Internet Explorer 7, and Internet Explorer 8. While Opera has not yet been compared to Chrome, previous tests have shown it to be slightly slower than Firefox 3, which in turn is slower than Chrome. A blog post by Mozilla developer Brendan Eich, comparing the JavaScript engines in Firefox 3.1 and Chrome using the SunSpider test results, states that some tests are faster in one engine and some are faster in the other. John Resig, Mozilla’s JavaScript evangelist, further commented on the performance of different browsers on Google’s own suite, finding Chrome “decimating” the other browsers, but he questions whether Google’s suite is representative of real programs. He states that Firefox performs poorly on recursion-intensive benchmarks, such as Google’s, because the Mozilla team has not implemented recursion tracing yet. Chrome also uses DNS prefetching to speed up website lookups.
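The DNS prefetching mentioned at the end can be pictured with a small sketch (TypeScript on Node.js; an illustration of the idea, not Chrome’s implementation): resolve the hostnames of a page’s links ahead of time, so that a later click does not have to wait for a DNS lookup:

```typescript
// Sketch of the DNS-prefetching idea (not Chrome's implementation): resolve
// link hostnames ahead of time so a later click skips the DNS lookup.
import { lookup } from "node:dns/promises";

async function prefetchDns(links: string[]): Promise<void> {
  const hosts = [...new Set(links.map((link) => new URL(link).hostname))];
  await Promise.all(
    hosts.map(async (host) => {
      try {
        const { address } = await lookup(host);
        console.log(`warmed DNS for ${host} -> ${address}`);
      } catch {
        console.log(`could not resolve ${host}`);
      }
    }),
  );
}

prefetchDns([
  "https://www.google.com/chrome/",
  "https://www.mozilla.org/firefox/",
]);
```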
To download Google Chrome, click here:
http://www.google.com/chrome/index.html?hl=en-GB&brand=CHMA&utm_campaign=en-GB&utm_source=en-GB-ha-apac-in-bk&utm_medium=ha&utm_term=google%20chrome

“How to do effective keyword research?”

Recently I had a chance to use the Keyword Elite SEO software. It is indeed a very user-friendly keyword research tool, and working with it highlighted the factors below, which matter most for effective keyword research and search engine ranking:
- Position of keywords: Keywords are essential for visitors to find what they’re looking for on your website. The best places for these keywords are the title of the article or webpage and the top of the page in the title bar (a simple check of this is sketched after the list).
- Zero downtime: Visitors are impatient, and there are loads of websites that can give them the same information you are giving. On top of that, downtime and slow pages are sure to make visitors disappear. 24×7 uptime is an absolute must for your website.
- Easy navigation: Nothing gives a website as much negative publicity as poor navigation. If users are not able to find what they’re looking for quickly, they will leave your website. Website design is critical in determining search engine popularity: if users are happy with your website design, search engines will tend to favor your website as well.
- The right keywords: Having the right keywords can mean the difference between an “also ran” website and one that is at the top of the list. Using overly common keywords risks getting lost in the crowd, while using unpopular keywords means losing traffic altogether. It is best to use strategic, powerful keywords that stand out, add value to your website, and describe what your website offers as concisely as possible.
- Meta Tags: These are machine-friendly descriptions of what your website is about. They are embedded in the HTML of a webpage and are read by the robots that crawl and index your website.
- Avoid overusing graphics: Graphics cannot be read by search engine robots, so overusing them means missing chances to improve your website’s search engine ranking. It is best to use HTML text as much as possible.
- Regularly monitor website listing: It is important to frequently monitor the performance of your website in terms of search engine ranking. In this age of tough competition and websites scrambling to get listed on search engines, it is essential to know where you stand.
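As a rough, hypothetical illustration of two of the tips above (keywords in the title and the presence of meta tags), the TypeScript sketch below runs simple string checks against a page’s HTML; a real tool such as Keyword Elite does far more than this:

```typescript
// Simple on-page checks (a toy sketch, not a full SEO tool): does the target
// keyword appear in the <title>, and is a meta description tag present?
function checkOnPageSeo(html: string, keyword: string) {
  const title = /<title>([^<]*)<\/title>/i.exec(html)?.[1] ?? "";
  return {
    keywordInTitle: title.toLowerCase().includes(keyword.toLowerCase()),
    hasMetaDescription: /<meta\s+name=["']description["']/i.test(html),
  };
}

const sampleHtml = `
  <html><head>
    <title>Effective Keyword Research Tips</title>
    <meta name="description" content="How to pick the right keywords.">
  </head><body>...</body></html>`;

console.log(checkOnPageSeo(sampleHtml, "keyword research"));
// -> { keywordInTitle: true, hasMetaDescription: true }
```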