In At The Bleep End
Tuesday, October 27, 2009
Malware Hunting
Handy intro to malware hunting here...
http://synjunkie.blogspot.com/2007/11/hunting-malware-in-windows.html

Monday, October 05, 2009
Google says valid sites don't get a ranking boost
Most lists of the benefits of W3C-validating your website's HTML claim that it helps your site rank better. They also usually claim validation makes a site more likely to work cross-browser and cross-platform. Now Google's Matt Cutts has restated that W3C validation is not taken into account in Google's ranking, and that one reason Google.com doesn't itself validate is that it would make it less compatible with some obscure browsers and platforms...
http://www.seroundtable.com/archives/020778.html
(earlier article where he mentioned the ranking question here)
Here's an article containing a list of major websites which fail validation...
http://www.codinghorror.com/blog/archives/001234.html

Sunday, October 04, 2009
Making Google maps from Google spreadsheets
Making google maps from google spreadsheets
Creating dynamic client-side maps mashups with google spreadsheets

Google's search engine optimization starter guide

Link Building Fundamentals
http://www.bruceclay.com/blog/archives/2008/10/link_building_f.html
Includes links to various tools...
"Link Building Tools:
Link Harvester pulls back all the information in an Excel spreadsheet. It gives you the number of unique linking domains, the number of unique Class C blocks, the number of .edu/.gov/.mil links, and how many are linking to the home page versus deep in the site. She recommends using the tool on competitors.
Hub Finder compares the backlinks of two or more sites and points out their co-occurring backlinks.
The Langreiter Tool compares Yahoo and Google rankings and where they agree or disagree.
The Search Status Tool turns all links using nofollow pink. It cuts your time down so you don't have to go to the source code to find out if the page nofollows links.
The Bad Neighborhood Tool will give you an idea of links coming from risky neighborhoods.
Google Alerts is a discovery tool that will let you know any time one of your articles has been used, and you can go there to see if they've linked to your site.
The Utility Linking Tool will pull up directories and forums that you can submit your URL to based on your keyword."

SQL Injection
Filtering SQL Injection in Classic ASP
SQL Injection Tester
What to do if your site is vulnerable
SQL Injection cheat sheet
More - with an example

Google Analytics Power User Guide
http://blog.vkistudios.com/index.cfm/2009/6/5/The-Google-Analytics-Power-User-Guide

Regular Expressions aka regex
Regex is a way of searching for patterns in text using different conditions & wildcards. A nice intro to the concept here...
http://www.4guysfromrolla.com/webtech/090199-1.shtml
More here...
http://www.4guysfromrolla.com/webtech/RegularExpressions.shtml
...and this is a minimal example of how to do it in VBScript (ASP):

hr1 = "text to be searched here"
Set regEx = New RegExp
regEx.Pattern = "pattern to search for"
If regEx.Test(hr1) Then Response.Write "Match found"

Obviously, lines 1 & 3 are to be customised to whatever you are doing.

SEO - why to avoid link farming
http://www.seomoz.org/blog/whiteboard-friday-link-farming

Making all links in Blogger open in a new window
http://blogging.nitecruzr.net/2008/02/make-all-links-open-in-new-window.html

ASP site search
http://www.asptutorial.info/script/searchfiles/
This can be refined by adding 'start-search' and 'end-search' HTML comment markers in the code of each page you want to be searchable, then editing the script so only the text between these markers is searched. This removes the false positive results you get when text in meta tags and navigation is included.

Standard Terms and Conditions for websites
via Docstoc

SEO tip - only the first link counts
Make sure the anchor text of the first copy of the same link on a page is the one you optimise...
http://www.seomoz.org/blog/results-of-google-experimentation...

What to do if your email is blocked as spam by an ISP
http://www.imediaconnection.com/content/7005.imc

Getty Images legal claims
For some time now the stock photo publishers Getty (and Corbis) have been using a standard, and presumably very profitable, approach towards websites which use images from their libraries without paying. They claim it doesn't matter whether the infringer was aware they were using images which should have been licensed. They use a company based in Israel to find the infringing images, using image-searching software, then claim outlandish sums from the infringer, with the threat of a larger claim via legal action. What happens next can take various paths, detailed here...
http://www.fsb.org.uk/discuss/forum_posts.asp?TID=1106
As stated on the forum, Getty may have tried to take only one company to court over this. The case was settled before reaching court...
http://www.out-law.com/page-10367
...so we still don't know how the courts would assess such a claim.

Testing PC memory with Memtest86+
http://icrontic.com/articles/diagnose_with_memtest86
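
The 'ASP site search' refinement above - indexing only the text between 'start-search' and 'end-search' HTML comment markers - can be sketched roughly as follows. This is a Python sketch for illustration rather than the ASP script from the linked tutorial; the `page`, `searchable_text` and `page_matches` names are made up for the example, but the marker comments follow the tip above.

```python
import re

# Hypothetical page content; only the part between the markers should be searched.
page = """<html><head><meta name="description" content="widgets cheap"></head>
<body><div>Home | About | Contact</div>
<!-- start-search -->
<p>Our widgets are hand-made in small batches.</p>
<!-- end-search -->
<div>footer navigation</div></body></html>"""

def searchable_text(html):
    """Return only the text between the start-search and end-search markers."""
    m = re.search(r"<!--\s*start-search\s*-->(.*?)<!--\s*end-search\s*-->",
                  html, re.DOTALL)
    body = m.group(1) if m else ""
    # Strip any remaining tags so only visible text is matched.
    return re.sub(r"<[^>]+>", " ", body)

def page_matches(html, term):
    """Case-insensitive search restricted to the marked region."""
    return term.lower() in searchable_text(html).lower()

print(page_matches(page, "widgets"))      # True - inside the markers
print(page_matches(page, "description"))  # False - meta tag is outside the markers
print(page_matches(page, "Contact"))      # False - navigation is outside the markers
```

Searching only the marked region is what removes the false positives from meta tags and navigation text.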
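
On the SQL injection links above: the core defence the linked articles describe is to stop concatenating user input into SQL strings and bind it as parameters instead. A minimal sketch of the difference, using Python and sqlite3 for illustration rather than Classic ASP (the `users` table, `login_unsafe` and `login_safe` are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # String concatenation: attacker-controlled input is parsed as SQL.
    sql = ("SELECT COUNT(*) FROM users WHERE name = '" + name +
           "' AND password = '" + password + "'")
    return conn.execute(sql).fetchone()[0] > 0

def login_safe(name, password):
    # Parameterised query: input is bound as data, never parsed as SQL.
    sql = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return conn.execute(sql, (name, password)).fetchone()[0] > 0

attack = "' OR '1'='1"
print(login_unsafe("alice", attack))  # True - the injected OR bypasses the check
print(login_safe("alice", attack))    # False - the injection string fails as a password
```

In Classic ASP the equivalent is an ADODB.Command with Parameters rather than a concatenated query string.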