filter: alpha(opacity=80); /* Internet Explorer */
-moz-opacity: 0.8; /* Mozilla Firefox (legacy) */
-khtml-opacity: 0.8; /* Safari (legacy) */
opacity: 0.8; /* CSS3 standard */
}
Someday, when CSS3 goes completely live, we won’t have to worry about opacity not validating; but what about the other browser-specific styles we need in order to support multiple browsers? If you don’t care about passing W3C CSS validation, this article is not for you. But those of you who require validation, or have a perfectionist nature, can work around the validation engine using both JavaScript and CSS.
Although it may not be the best method out there, using JavaScript to load impossible-to-validate CSS works best for me.
Why does it work? It’s simple: most bots and spiders don’t render JavaScript, presumably for a number of reasons. Rendering JavaScript would slow down the bots’ very purpose of data mining (or, with bad bots, of infiltrating), and there usually isn’t anything valuable to a bot that would come from having to render it. (As a side note, this doesn’t mean there aren’t bots that seek out sites with vulnerabilities in their JavaScript markup, but reading JavaScript and rendering JavaScript are completely different things.)
So first, it’s a matter of migrating the browser-specific code to its own stylesheet (so our JavaScript can include it).
/* invalidable.css */
#selector {
    /* random properties */
    zoom: 1;
    -moz-border-radius: 5px; /* example value */
    /* opacity properties */
    filter: alpha(opacity=80); /* Internet Explorer */
    -moz-opacity: 0.8; /* Mozilla Firefox (legacy) */
    -khtml-opacity: 0.8; /* Safari (legacy) */
    opacity: 0.8; /* CSS3 standard */
}
Once you have your browser-specific properties in their own stylesheet, it’s just a matter of creating a JavaScript file that will “dynamically” insert the <link type="text/css" rel="stylesheet" href="/assets/styles/invalidable.css" media="screen"/> into your page (thus keeping with XHTML Strict standards).
// invalidable.js <-- note the extension
//
// Dynamically inserts the CSS link tag
var headTag = document.getElementsByTagName("head")[0];
var linkTag = document.createElement('link');
linkTag.type = 'text/css';
linkTag.rel = 'stylesheet';
linkTag.href = '/assets/styles/invalidable.css';
linkTag.media = 'screen';
headTag.appendChild(linkTag);
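As a side note, if your include tag ends up at the bottom of the page rather than in the head, you may want to defer the insertion until the page has loaded. Here is a minimal sketch of that variant; the window.onload wrapper is my own addition, not part of the original snippet:
// invalidable.js (deferred variant, a sketch rather than the original code)
// Waits for window.onload before appending the link tag, and preserves any
// onload handler that may already be registered.
var previousOnload = window.onload;
window.onload = function () {
    if (previousOnload) previousOnload();
    var headTag = document.getElementsByTagName("head")[0];
    var linkTag = document.createElement('link');
    linkTag.type = 'text/css';
    linkTag.rel = 'stylesheet';
    linkTag.href = '/assets/styles/invalidable.css';
    linkTag.media = 'screen';
    headTag.appendChild(linkTag);
};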
Now all you have to do is throw in a JavaScript include tag:
<script type="text/javascript" src="/assets/js/invalidable.js"></script>
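If you want to confirm that the stylesheet actually landed in the document, a quick sanity check like the following sketch can help; the loop and the alert wording are my own, not part of the original article:
// Hypothetical sanity check: scan the link tags for invalidable.css.
var links = document.getElementsByTagName('link');
var found = false;
for (var i = 0; i < links.length; i++) {
    if (links[i].href.indexOf('invalidable.css') !== -1) {
        found = true;
        break;
    }
}
alert(found ? 'invalidable.css is loaded' : 'invalidable.css is missing');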
You’re done! Bots will now only get this:
<script type="text/javascript" src="/assets/js/invalidable.js"></script>
…while visitors will get this:
<script type="text/javascript" src="/assets/js/invalidable.js"></script>
<link type="text/css" rel="stylesheet" href="/assets/styles/invalidable.css" media="screen"/>
NOTE: It’s important to mention that visitors who have JavaScript DISABLED will NOT have the CSS file included, simply because the JavaScript won’t run; however, it’s incredibly rare to have visitors with JavaScript disabled. The only cases you’ll probably run into are visitors who browse your site from a cheap mobile phone (the iPhone supports JavaScript), or visitors who know what they’re doing and have a Firefox add-on like “NoScript” installed.
On another subject, it’s important to mention that Google now penalizes sites for showing “different” content to search engines than to visitors. Would I classify an extra line in the head section as different? Probably not. It’s my opinion that Google’s algorithm checks for differences in content, not necessarily markup. I also haven’t seen many cases where practices such as using JavaScript to show/hide content were penalized in any way.