Tuesday, November 15, 2011

American Censorship Day is this Wednesday — And You Can Join In! | Electronic Frontier Foundation

This Wednesday, November 16, the disastrous "Stop Online Piracy Act" (SOPA) heads to the House Judiciary Committee. In case you need a refresher, SOPA could allow the U.S. government and private corporations to create a blacklist of censored websites, and cut many more off from their ad networks and payment providers. This bill is bad news, and its supporters are trying to push it through before ordinary citizens realize just how much damage it can cause.

That’s why we're proud to announce that, along with groups like Public Knowledge, the Free Software Foundation, and Demand Progress, EFF is joining Fight for the Future in an initiative called American Censorship Day. We're calling out the representatives responsible for this bill and letting them know that this type of Internet censorship is, as TIME Magazine puts it, “a cure worse than the disease”. In fact, we think it’s much worse.

If you run a website, you too can join in the protest. One easy way is to go to the American Censorship Day website, which Fight for the Future runs, and follow the instructions there to grab their code to embed on your page. On Wednesday, that code will give visitors the chance to write or call their representatives and sign up for future updates from Fight for the Future without leaving your site.  Starting Saturday, Fight for the Future will also post instructions on how to “black out” your site logo as a second method of protest.

You should review the details on the American Censorship Day page to make sure the code works with your site and complies with your privacy policy. If you can’t take these actions, there are other ways to get involved: spread the word to your friends about how catastrophic this bill could be for Internet freedom, encourage other site owners to participate in American Censorship Day, and sign our petition to stop Internet blacklist legislation!

Court makes it official: You have no privacy online — Tech News and Analysis

Any comment?

Monday, November 14, 2011

The Metrics Behind Guest Blogging [Case Study] - by Web Design Company 352 Media Group

This Gainesville-based company should be on your radar. When I last spoke to the owner, they were looking for an intern. Look what a guest blog post did for their numbers!

Do You Know Google’s Official Stance On Mobile Search & SEO?

Earlier this month Google announced GoMo, clearly labeled as “a Google Initiative”, as though it represents the official Google position on the value of mobile content and mobile sites.

But what is the official Google stance on mobile sites, search and SEO? Will having mobile content help in search results?

Many people will claim that Google has offered an official position on mobile search and SEO, but they don’t realize that someone else at Google has offered a different, sometimes even contradictory, position on the subject.

How many stances have people from Google offered on webmaster issues related to mobile search? Eight, by my count.

1.  Matt Cutts On Mobile Duplicate Content & Mobile URLs

In January of this year, Matt Cutts answered a question about mobile SEO and recommended using a mobile URL for testing purposes, with redirects sending Googlebot-Mobile to the mobile site and Googlebot to the desktop site.

He didn’t address smartphone users, who are the most active mobile users. He did, however, say one thing that Google has been consistent about: mobile content is not duplicate content, and if you redirect the appropriate bot to it, you won’t be seen as cloaking. He reiterated this stance in his most recent video on cloaking.
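To make that concrete, here is a minimal sketch (mine, not Google's) of the kind of user-agent-based redirect Cutts describes, assuming a desktop site at www.example.com and a separate mobile site at m.example.com. It uses Flask, and the user-agent tokens it checks are illustrative rather than a complete list.

# Hypothetical sketch of user-agent-based mobile redirects; not Google's code.
from flask import Flask, redirect, request

app = Flask(__name__)

MOBILE_HOST = "http://m.example.com"  # hypothetical mobile mirror
# Partial, illustrative list: Googlebot-Mobile plus a few feature phone markers.
MOBILE_UA_TOKENS = ("googlebot-mobile", "midp", "wap", "symbian")

def looks_mobile(user_agent):
    """Rough check for Googlebot-Mobile and feature phone browsers."""
    ua = user_agent.lower()
    return any(token in ua for token in MOBILE_UA_TOKENS)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve(path):
    ua = request.headers.get("User-Agent", "")
    if looks_mobile(ua):
        # Mobile crawlers and feature phones go to the mobile URL;
        # desktop Googlebot and desktop browsers fall through to desktop content.
        return redirect(MOBILE_HOST + "/" + path, code=302)
    return "Desktop version of /" + path

The point, per Cutts, is that each bot ends up on the version meant for it, which Google does not treat as cloaking.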

2.  Google SEO Guide, Google Japan On Redirecting Mobile Content

The advice that Matt Cutts offered about redirects was taken from the Google Japan team, which in late 2009 recommended redirecting feature phone users to mobile sites.

This advice was reiterated in the mobile SEO section of the Google SEO Starter Guide, published almost a year later.

3.  Redirecting Traditional Mobile Content But Not Smartphone Traffic

When it comes to redirects, in February of this year Pierre Far of the webmaster team made a distinction between smartphone traffic and traditional mobile traffic that wasn’t made before.

He said that webmasters don’t need to do anything special for smartphone users, but it may make sense for some websites. He also said mobile sitemaps are not for smartphone URLs, but for traditional mobile URLs.
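As a sketch of what that looks like in practice, here is a small script (URLs hypothetical) that emits a Mobile Sitemap of the kind Far describes, listing only traditional feature phone URLs and marking each with the <mobile:mobile/> annotation from Google's mobile sitemap format.

# Sketch: generate a Mobile Sitemap for feature phone URLs only (hypothetical URLs).
feature_phone_urls = [
    "http://m.example.com/index.html",
    "http://m.example.com/news.html",
]

def build_mobile_sitemap(urls):
    entries = "\n".join(
        "  <url>\n"
        "    <loc>" + u + "</loc>\n"
        "    <mobile:mobile/>\n"      # marks the URL as feature phone content
        "  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">\n'
        + entries + "\n</urlset>"
    )

print(build_mobile_sitemap(feature_phone_urls))

Smartphone URLs, on Far's account, don't belong in this file at all.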

4. Redirecting Smartphone Traffic To Mobile Sites & Tablet Users To Desktop Experience

A little more than a month later, Maile Ohye of the Google Webmaster Team said that it’s reasonable to direct Android smartphone users to your mobile content, but that you should send Android tablet users to your desktop content.
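Here is a rough sketch of the distinction Ohye draws, using the common (and imperfect) heuristic that Android phone browsers send "Android" plus "Mobile" in the user agent while Android tablets send "Android" without "Mobile"; the destination URLs are hypothetical.

def destination_for(user_agent):
    """Hypothetical routing rule: Android phones to mobile, tablets to desktop."""
    ua = user_agent.lower()
    if "android" in ua and "mobile" in ua:
        return "http://m.example.com/"    # Android phone: mobile site
    if "android" in ua:
        return "http://www.example.com/"  # Android tablet: desktop content
    return "http://www.example.com/"      # everyone else: desktop by default

A Nexus One browser, for example, identifies itself with "Android" and "Mobile" in its user agent, so it would land on the mobile site under this rule.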

But didn’t Pierre Far just say no redirects were necessary for smartphone searchers?

5. Google’s John Mueller On Single URL Mobile Strategy

John Mueller of the Google Webmaster Team took questions on his Buzz page (now on Google+), where he recommended a single-URL mobile strategy for smartphone content in order to reduce redirects and make the experience faster for mobile users.

He then clarified his position on Search Engine Roundtable, saying “If the touch.example.com site is significantly different that it covers a special niche, then maybe that’s ok [to index separately and not add rel=canonical to].”

  • Update – John Mueller later said on his Google+ page that either mobile URLs or desktop URLs formatted for mobile are fine with Google.
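One way to read that single-URL suggestion, sketched below with placeholder markup: keep one URL per page, vary the HTML by user agent instead of redirecting to a separate m. site, and send a Vary: User-Agent header so caches and crawlers know the response differs. This is an illustration of the idea, not a Google-endorsed implementation.

from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/article")
def article():
    ua = request.headers.get("User-Agent", "")
    if "Mobile" in ua:  # crude placeholder check
        body = "<html><body>Lightweight mobile markup</body></html>"
    else:
        body = "<html><body>Full desktop markup</body></html>"
    resp = make_response(body)
    # Signal that the content served at this single URL varies by user agent.
    resp.headers["Vary"] = "User-Agent"
    return resp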

6.  Providing A Fast, Relevant & Simple Mobile User Experience

In late June of this year, Google’s search quality head, Amit Singhal, said that Google is hyper-focused on getting the mobile user experience right in mobile search, without distinguishing between feature phone and smartphone traffic.

He said that Google focuses on making a fast, relevant and simple mobile user experience, and that’s why they’re poised to excel at mobile search.

In March of this year, the Page Speed team also said speed is more important for mobile users.

7.  Google’s Scott Huffman On Blended Mobile Ranking Algorithms

Google engineer Scott Huffman revealed at Searchology 2009 that Google has ways of presenting content it thinks is more relevant to mobile users than to desktop users for certain queries, and he confirmed this in a New York Times article this year.

Separate tests by Resolution Media and Covario both confirmed that smartphone rankings differ from desktop rankings. Yet it’s unclear whether having mobile-optimized content is actually a ranking factor in mobile search, since Pierre Far and others claim that desktop sites are adequate for smartphone browsers.

8.  Building Mobile Specific Content Rather Than Transcoding Desktop Experience

The most recent Google employee to speak about mobile is probably Avinash Kaushik, who recently gave a fantastic webinar called Re-think Mobile Marketing & Analytics.

The webinar focused on creating extraordinary mobile experiences that add to the brand value rather than detract from it. His basic premise was to create desktop content for desktop users, mobile content for mobile and smartphone users, and tablet-optimized content for tablet users, and to do it in a way that takes advantage of what that specific device can do.

This is much the same message we get from the best practices on the GoMo site, but it raises the question: if mobile content is so important for mobile users, why does Google show mobile users so many unusable desktop results in mobile and smartphone search?

Final Thoughts

So there are eight perspectives from Google employees on mobile sites and mobile search. There could be more, but these are the ones I know of. This is why it’s baffling when some writers claim that mobile SEO is a myth because one Google employee gave one opinion. If you really want to understand what Google the organization thinks about mobile sites and search, take the sum total of what they’ve said and try to make sense of it.

Granted, not all of these stances are mutually exclusive.

For example, you could build a fast site with a simple interface that’s extremely relevant to mobile user queries, as Amit Singhal suggests is necessary for mobile searchers, and you could build that on a mobile URL like m.domain.com as Matt Cutts suggests.

However, some of them are mutually exclusive.

For example, Maile Ohye and the Android Dev team say to redirect smartphone traffic, like the Nexus One, to mobile content, but Pierre Far says no redirects are necessary for smartphone users.

Also, if you build a fast site with a simple user interface that’s extremely relevant to user queries, why would you give smartphone users desktop content, which likely means slow going, pinching and zooming through tiny text to find a result?

Some clarification is needed here to help webmasters serve the right mobile content to the right searchers, as the answers given so far often raise more questions than they resolve.

Finally, I’m not suggesting that marketers need to wait until Google validates our mobile SEO strategies in order to keep optimizing.

On the contrary, things like doing mobile keyword research to understand how the mobile user experience differs from the desktop experience, and tailoring your content to that experience, are just good SEO, even though the user is mobile.

But Google, if you’d like to clarify your position on mobile sites, search and SEO, as Bing did last month, the webmaster community might find it easier to digest the sometimes contradictory positions above.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.

Bing On Mobile Search & SEO

HTML5: The Technology Changing the Web

HTML5 101

6 Best Practices for Modern SEO

It’s a good thing we all like change.

Friday, November 4, 2011

Researchers Defeat CAPTCHA on Popular Websites | PCWorld

Researchers from Stanford University have developed an automated tool that is capable of deciphering text-based anti-spam tests used by many popular websites with a significant degree of accuracy.

Researchers Elie Bursztein, Matthieu Martin and John C. Mitchell presented the results of their year-and-a-half-long CAPTCHA study at the recent ACM Conference on Computer and Communications Security in Chicago.

CAPTCHA stands for 'Completely Automated Public Turing test to tell Computers and Humans Apart' and consists of challenges that only humans are supposed to be capable of solving. Websites use such tests in order to block spam bots that automate tasks like account registration and comment posting.

There are various types of CAPTCHAs, some using audio, others using math problems, but the most common implementations rely on users typing back distorted text. The Stanford team devised various methods of cleaning up purposely introduced image background noise and breaking text strings into individual characters for easier recognition, a technique called segmentation.
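As a toy illustration of that segmentation step (this is not the Stanford team's Decaptcha code), the following binarizes a CAPTCHA image and splits it into character regions wherever a column contains no dark pixels; "captcha.png" is a placeholder filename.

import numpy as np
from PIL import Image

def segment(path, threshold=128):
    gray = np.array(Image.open(path).convert("L"))
    dark = gray < threshold          # True where a pixel is "ink"
    has_ink = dark.any(axis=0)       # per column: does it contain any ink?

    segments, start = [], None
    for x, col_has_ink in enumerate(has_ink):
        if col_has_ink and start is None:
            start = x                            # a character run begins
        elif not col_has_ink and start is not None:
            segments.append((start, x))          # run ended: one character slice
            start = None
    if start is not None:
        segments.append((start, len(has_ink)))
    return segments                              # list of (left, right) x-ranges

# Example (placeholder path): print(segment("captcha.png"))

Each slice would then go to a character recognizer; real CAPTCHAs defeat this naive projection approach with overlapping glyphs and background noise, which is why the researchers' noise-removal and segmentation methods are the interesting part.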

Some of their CAPTCHA-breaking algorithms are inspired by those used by robots to orient themselves in various environments and were built into an automated tool dubbed Decaptcha. This tool was then run against CAPTCHAs used by 15 high-profile websites.

The results revealed that tests used by Visa's Authorize.net payment gateway could be beaten 66 percent of the time, while attacks on Blizzard's World of Warcraft portal had a success rate of 70 percent.

Other interesting results were registered on eBay, whose CAPTCHA implementation failed 43 percent of the time, and on Wikipedia, where one in four attempts was successful. Lower, but still significant, success rates were found on Digg, CNN and Baidu -- 20, 16 and 5 percent respectively.

The only tested sites where CAPTCHAs couldn't be broken were Google and reCAPTCHA. The latter is an implementation originally developed at Carnegie Mellon University and bought by the Internet search giant in September 2009.

Authorize.net and Digg have switched to reCAPTCHA since these tests were performed, but it's not clear if the other websites made changes as well. Nevertheless, the Stanford researchers came up with several recommendations to improve CAPTCHA security.

These include randomizing the length of the text string, randomizing the character size, applying a wave-like effect to the output, and using collapsing characters or lines in the background. Another noteworthy conclusion was that using complex character sets has no security benefits and is bad for usability.
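As a hedged sketch of two of those recommendations (a random-length string and a wave-like distortion) using Pillow: this is only an illustration, not the researchers' hardened scheme, and it omits their other suggestions such as random character sizes, collapsing characters and background lines.

import math
import random
import string
from PIL import Image, ImageDraw, ImageFont

def make_captcha(path="captcha_demo.png"):
    # Randomize the text length, one of the recommended defenses.
    text = "".join(random.choices(string.ascii_uppercase + string.digits,
                                  k=random.randint(5, 9)))
    img = Image.new("L", (20 * len(text) + 20, 60), color=255)
    ImageDraw.Draw(img).text((10, 25), text, fill=0, font=ImageFont.load_default())

    # Wave-like effect: shift each column of pixels vertically along a sine curve.
    src, out = img.load(), Image.new("L", img.size, color=255)
    dst = out.load()
    width, height = img.size
    for x in range(width):
        shift = int(6 * math.sin(x / 10.0))
        for y in range(height):
            sy = y + shift
            if 0 <= sy < height:
                dst[x, y] = src[x, sy]
    out.save(path)
    return text

print(make_captcha())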

Bursztein and his team have also had other breakthroughs in this field in the past. Back in May, they developed techniques to successfully break audio CAPTCHAs on sites like Microsoft, eBay, Yahoo and Digg, and they plan to continue improving their Decaptcha tool in the future.