
Determine URL of an ImagePlaceholder

I came across a problem recently whereby I wanted to determine the MCMS URL of an image stored in the MCMS Resource Gallery. The script had several image placeholders, some of which may have had images bound to them. My requirement was to determine the URL of any given image and then use it in some CSS styling. Faced with this scenario, the following code should help:

//Set up placeholder object and string for URL
Placeholder pl;
string imgSrcUrl = "";
//Set up default CSS string
string FeatureCSS = "";
//Get the bound placeholder for this control
pl = ImageX.BoundPlaceholder;
//Test the placeholder is not null
if (pl != null)
{
    //Get the source URL from the placeholder
    imgSrcUrl = ((ImagePlaceholder)pl).Src;
    if (imgSrcUrl != null && imgSrcUrl != "")
    {
        FeatureCSS = "<style type=\"text/css\">\n#section {\nheight: 232px;\nbackground: url(" + imgSrcUrl + ") no-repeat -26px 0;\n}\n</style>";
    }
    else
        doSomethingElse();
}
else...
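To actually get that style block onto the page, one approach (a sketch of my own, not part of the original post) is to drop an asp:Literal control into the template's head and assign the generated string to it in the code-behind. The control name litFeatureCss and the BuildFeatureCss helper below are purely illustrative assumptions:

//Sketch only: assumes the template .aspx declares, inside its <head>:
//  <asp:Literal id="litFeatureCss" runat="server" />
//and that BuildFeatureCss() wraps the placeholder/URL logic shown above (hypothetical helper)
private void Page_Load(object sender, System.EventArgs e)
{
    string featureCss = BuildFeatureCss();
    if (featureCss != null && featureCss != "")
    {
        //The Literal renders its Text verbatim, so the <style> element lands in the page head
        litFeatureCss.Text = featureCss;
    }
}

This keeps the CSS generation separate from the markup, so a template without a bound image simply renders no extra style block.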

Welcome

to RJPargeter.com – SEO, Google AdWords Management & Web Development. With extensive experience in web site development and management since the mid-1990s, working at organisations and companies such as USC, Sony, WGSN.com & University of Warwick, I believe I can offer realistic web site solutions for your business needs. Please browse my services via the navigation links or use the contact details throughout the site to enquire about your web site requirements and how I can...

PageRank no longer important?

I keep hearing people say that Google's PageRank is no longer important and that X or Y matters much more. I still believe that PageRank is one of the most important factors in a site ranking well in the Google results. Firstly, Google still includes a PageRank value on the Google toolbar – OK, this is not updated immediately, but I would assume that is down to infrastructure and speed issues. Secondly, Google went to all the trouble of patenting the PageRank algorithm through Stanford University, and I doubt they are going to just leave it behind. PageRank has been the foundation of Google's success, and although it is undoubtedly tweaked and used alongside other algorithms, it is still, in my opinion, the basis of the initial search engine rankings. Thirdly, when you hear Google employee Matt Cutts discussing PageRank in such detail you know it is still fundamental to how Google works – http://www.mattcutts.com/blog/more-info-on-pagerank/. As Matt discusses in that article, the PageRank shown in the toolbar is usually late to reflect how your site is ranking; that, however, is just the toolbar representation. Behind the scenes, PageRank is changing all the time as pages are added to the index, pages are removed and backlinks are recalculated. So there you have it – PageRank is still alive and soldiering on. While Mr Cutts still talks about PageRank we know it is still one of the main factors in Google search engine...
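For reference – this is my own aside rather than part of the post – the PageRank formula published in the original Stanford paper is usually written as:

$$PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)$$

where $T_1 \ldots T_n$ are the pages linking to page $A$, $C(T_i)$ is the number of outbound links on $T_i$, and $d$ is a damping factor, typically around 0.85.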

‘Black-Hat’ SEO Techniques

The following SEO techniques are known as ‘black-hat’ SEO and should be avoided on your web site. These techniques can sometimes provide very quick improvements in search engine results, but over time they are known to cause sites to be banned and removed from indexes altogether.

Hidden Text

An old technique to increase keywords on a page was to include long lists of keywords and key phrases as hidden text. This was sometimes achieved by placing the text far below the main content on the page or by displaying the text in the same colour as the background, i.e. white on white or black on black. This technique goes against Google’s webmaster guidelines.

Cloaking or Doorway Pages

Don’t deceive your users or present different content to search engines than you display to users. Matt Cutts (Google employee) describes a classic case of this on his blog (http://www.mattcutts.com/blog/ramping-up-on-international-webspam/) whereby BMW displayed text to search engine robots while normal web users were shown other content via a JavaScript redirect. It is important not to use quick redirects on pages so that search engine crawlers see certain content while a user’s browser redirects to another page.

Link Schemes

Avoid link schemes that provide a massive increase in incoming links from bad neighbourhoods and other sites of dubious content. Participating in such schemes can result in your site being penalised.

Automated Search Engine Submission Software

Avoid using search engine submission software such as ‘WebPosition Gold’ or other similar products. As long as your site is linked to from other sites and is up and running with a valid robots.txt file, your pages and content will be indexed without the need for this software.

Duplicate Content

Avoid duplicating the same content on different pages. If Google detects large amounts of duplicated content across sub-domains or main domains you risk a ‘duplicate content penalty’, which can result in the site losing rankings. Sub-domains are treated as separate websites, and if duplicate content is found both sites can suffer ranking problems until Google’s algorithms determine which site was the original source of the content.

Content Solely for Search Engines

Avoid publishing any content that is solely for search engine spiders. Content that is too rich in keyword density and unintelligible to a human can be detected by Google and other search engines and result in site penalties. Always write content firstly for humans and secondly for crawlers and robots. As long as the pages are...

SEO Part 14 – Web Based Sitemap

A site map enables search engines to easily find all content on the site by following the links from the sitemap page(s). With the implementation of Google Sitemaps, discussed in the ‘Google webmaster tools’ section, there is no longer a huge need for a web-based sitemap from a Google SEO point of view. For other search engines, however, it is beneficial to have a few pages on the site that link to all currently published content. A single page can place an overhead on the system for large sites, and for the front-end user it can be too big to download and view – therefore several pages of links could make up such a sitemap (see the sketch below). The sitemap start page should be linked to from the homepage, as this is the most common page that search engine spiders will...
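As a rough illustration of splitting a large sitemap across several pages (my own sketch rather than anything from the article – the class, method and parameter names are assumptions), a simple routine could chunk the published URLs into groups and emit one HTML list of links per sitemap page:

using System;
using System.Collections.Generic;
using System.Text;

class SitemapPageBuilder
{
    //Split the full list of published URLs into pages of linksPerPage links each,
    //returning one HTML fragment per sitemap page
    public static List<string> BuildSitemapPages(IList<string> publishedUrls, int linksPerPage)
    {
        var pages = new List<string>();
        for (int start = 0; start < publishedUrls.Count; start += linksPerPage)
        {
            var html = new StringBuilder("<ul>\n");
            int end = Math.Min(start + linksPerPage, publishedUrls.Count);
            for (int i = start; i < end; i++)
            {
                //Each entry links to a published page; the anchor text could be the page title instead
                html.AppendFormat("  <li><a href=\"{0}\">{0}</a></li>\n", publishedUrls[i]);
            }
            html.Append("</ul>");
            pages.Add(html.ToString());
        }
        return pages;
    }
}

Each returned fragment would then be dropped into its own sitemap page, with the first of those pages linked from the homepage.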