Monday, November 30, 2009

SMSONE: Micro-Local News From India to Make Silicon Valley Jealous

Of the hundreds of companies I meet in any given country, I only write about a handful. Sometimes it’s the ones that seem to be copying a US idea, but in reality are building their company in a completely unique—and frequently more profitable—way. Other times, I’m captivated by an idea that’s perfect for an emerging market, but probably wouldn’t work in the US.

But every once in a while I find a company that hits the trifecta: It’s addressing a big problem locally, it’s something I don’t think is offered in the US, and… I want it. And when a product in undeveloped, chaotic, messy India can make someone in Silicon Valley feel jealous, you know that entrepreneur has come up with something good.

I’m talking about SMSONE Media, a company I met in Pune about a week ago. Like most of the impressive companies I saw in India, it’s aimed squarely at the base of the pyramid and is using basic SMS to deliver services to people in some of India’s most unconnected areas. It was started by Ravi Ghate, who proudly points out that none of his core team graduated from high school, much less attended an IIT or IIM. (Typically not something you brag about in India.)

SMSONE is basically a very local newsletter. Ghate goes to a village and scouts out an unemployed youth—preferably one who’s had jobs as a street vendor or has experience going door-to-door shilling for local politicians. The kid pays Ghate 1,000 rupees (about $20) for the “franchise” rights to be the local reporter for that village. He goes door-to-door signing up 1,000 names, phone numbers and other basic information, then mails the slips to Ghate. Ghate enters it all into his databases, and all those “subscribers” get a text introducing the kid as their village’s reporter. In India all incoming texts are free, so the subscribers don’t pay anything.

And what readers get is pretty powerful. Right now there is no way to get a timely message to people in a village. There’s no Internet access, no TV, no local paper, and frequently no electricity. All they have is a basic mobile phone. SMSONE’s service can give farmers instant updates about crop pricing or news of a seed or fertilizer delivery a town away. That means the farmer only makes the trip when he knows the shipment is there, rather than wasting days of travel hoping it has arrived.

Consider something even more fundamental: Water. Many of the villages have government-owned water pipes that are turned on for an hour or so once a day, or in some areas once a week. Everyone has to bring their vats, pitchers and empty kerosene cans and get as much water as they can while the pipes are on. But these pipes don’t really run on a schedule, so people frequently miss getting the day’s or week’s water. Now, SMSONE subscribers get a text when the pipes are about to be turned on.

I know it’s not as life-changing, but I’d pay to get micro-local, highly relevant news about my neighborhood in San Francisco in 160-character bursts, whether it’s about a power or cable outage, a construction project that’s disrupting traffic or details on a shooting that just happened. And I might even welcome local ads that report a hot new restaurant opening or a sale at a boutique two streets over. I feel like modern, uber-connected life has made us less interested in “local news” as we used to think of it on a city or region level, but more interested in the micro-local, hence the excitement in the Valley around Foursquare, CitySourced, and a host of location-aware iPhone apps.

But the beauty of what Ghate has built is its simplicity. It doesn’t need a $300 smartphone and it doesn’t need GPS locators or a platform like Twitter to run on. Sometimes the most powerful innovation is built under the most extreme constraints.

I’m hardly the first to be impressed by what Ghate has created. He has won a host of awards including the Clinton Global Initiative’s YES Fund Award in 2008. And similar models are being built in parts of Africa where there’s similar mobile ubiquity and little else in the way of communications.

Subscribers aren’t the only ones whose lives change. That once-unemployed kid suddenly has important local standing in his community. In addition to writing 160-character local news stories, he also sells local ads. Like a newspaper, Ghate enforces a ratio of ads to stories, so the news doesn’t get overrun by promotions.

The economics work out like this: Out of a 1,000 rupee ad sale, 300 rupees goes to the reporter, and Ghate pays him an additional 50 rupees for each news story. That adds up to a nice income for a village kid, but not so high that he picks up and moves to the big city. Ambitious franchisees can even hire a few other reporters, expand their subscriber base and make more money.
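That split is easy to sanity-check. In the sketch below, only the 300-rupee ad share and 50-rupee story fee come from the article; the monthly ad and story counts are hypothetical illustrations, not SMSONE figures.

```python
# Rough sketch of a franchisee's monthly take-home under the revenue
# split described above. Ad/story volumes are invented for illustration.

AD_PRICE = 1000          # rupees per ad sale (per the article)
REPORTER_AD_SHARE = 300  # rupees the reporter keeps from each ad
STORY_FEE = 50           # rupees Ghate pays per news story

def monthly_income(ads_sold, stories_written):
    """Reporter's income: a cut of each ad plus a per-story fee."""
    return ads_sold * REPORTER_AD_SHARE + stories_written * STORY_FEE

# e.g. a hypothetical month of 10 ad sales and 30 stories:
print(monthly_income(10, 30))  # 10*300 + 30*50 = 4500 rupees
```

At that (invented) volume the reporter clears several thousand rupees a month, which squares with the article's point: meaningful village income, but not big-city money.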

Right now Ghate’s operation is in 400 communities, reaching roughly 400,000 readers. He just got an investment from the government of Bangalore to boost that reach to five million readers in the next four months.

Ghate is clear that the money will be used strictly to reach more people. The company already breaks even and Ghate makes enough to pay his basic living expenses. He doesn’t care about fancy cars or clothes. It wasn’t too long ago that he was one of those disadvantaged kids, selling flags and berries on the side of the road and being told to go away. He still regularly travels between villages by bus and stays in $5-a-night hotels. He’s promised to take me with him on my next trip to India, to see how the service works first hand and meet some of these young “reporters.”

“I’ll be back in February,” I said. “Will you have 5 million readers by then?”

“Not quite,” he said looking up at the ceiling, seemingly counting in his head. He looked down at me again, smiled and said, “Come in April.”

Source:

http://www.techcrunch.com/2009/11/30/smsone-micro-local-india-news/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29

Custom Web Design Chicago

Sunday, November 29, 2009

EU and Oracle to Meet Soon on Sun Deal

The European Commission is delaying approval of the $7.4bn deal because of concerns that the combination of Sun's MySQL database product and Oracle's products could harm competition in the database market.

Oracle rejected the claim that MySQL competes with its core database software and said it will "vigorously oppose" the Commission's objections.

Oracle said the objections revealed a "profound misunderstanding of both database competition and open source dynamics".

The EU last week pushed back its deadline for reviewing the merger after granting a request from Oracle for more time to provide a counter argument to concerns over competition.

The US Department of Justice approved the deal in August, but had raised concerns over the future of Sun's Java software and programming language, not MySQL.

The UK Oracle User Group (UKOUG) has told the European commissioner for competition, Neelie Kroes, that its membership believes the future of both Java and MySQL will be secure with Oracle.

"Oracle, in terms of strategy and commitment to open source, will provide a secure future for Java," said Ronan Miles, chairman of UKOUG.

"Oracle's record of preserving customer investments and support of open standards would indicate as safe a future for MySQL as any other 'owner'," he added.

Source:


Thursday, November 26, 2009

Analytics at Twitter

Last week I spent some time speaking with Kevin Weil, head of analytics at Twitter. From a technology perspective, Twitter has had a bit of a hard time due to the stability issues of its early days. Kevin was keen to point out that he feels this was due to the incomparable growth Twitter was experiencing at the time and its constant struggle to keep up. He was also keen to show that Twitter prides itself on striving for engineering excellence, on creating and contributing to new technologies, and generally on helping push the boundaries forward. Our conversation naturally centered on analytics at Twitter.

Twitter, like many Web 2.0 apps, started life as a MySQL-based RDBMS application. Today, Twitter is still using MySQL for much of its online operational functionality (although this is likely to change in the near future – think distributed), but on the analytics side Twitter has spent the last six months moving away from running SQL queries against MySQL data marts. This was because MySQL was struggling to meet their need for timely data, particularly when dealing with very large data volumes and complicated queries. For Web 2.0 companies, the ability to understand, quantify and make timely predictions from user behavior is very much their lifeblood. When Kevin arrived at Twitter six months ago he was tasked with changing the way Twitter analyzed its data. Now the bulk of their analytics is executed on a Hadoop platform with Pig as the “querying language”.

Hadoop is a distributed shared-nothing cluster which locates data throughout the cluster using a virtualized file system. What has made Hadoop particularly popular for large-scale deployment is the comparative ease of writing distributed functions through a process known as map/reduce. Map/reduce hides much of the complexity of running distributed functions, even when running over a very large number of nodes. This allows developers to focus on their application logic rather than worrying about the specifics of the execution process (Hadoop handles distribution of execution, node failures, etc.). That said, expressing complicated application logic directly in map/reduce functions can become quite laborious, as many pipelined map/reduce functions may be required to take raw data through to a useful processed result. Because of this complexity, several higher-level scripting languages have appeared to abstract it away.
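The division of labor described above can be sketched in miniature on a single machine. This is an illustrative Python toy, not Hadoop code: the developer supplies only the map and reduce functions, while the framework's real work (distributing those calls across nodes, shuffling intermediate data, surviving failures) is collapsed here into an in-memory loop.

```python
# Toy single-machine model of map/reduce: word counting.
from collections import defaultdict

def map_fn(line):
    # Map phase: emit a (key, value) pair for every word in a line.
    return [(word, 1) for word in line.split()]

def reduce_fn(word, counts):
    # Reduce phase: combine all values emitted for one key.
    return word, sum(counts)

def run_job(lines):
    # "Shuffle" phase, done by the framework on a real cluster:
    # group every mapped value under its key.
    groups = defaultdict(list)
    for line in lines:
        for key, value in map_fn(line):
            groups[key].append(value)
    # One reduce call per distinct key.
    return dict(reduce_fn(k, v) for k, v in groups.items())

print(run_job(["to be or not", "to be"]))  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

The "many pipelined map/reduce functions" problem the paragraph mentions arises when a real analysis needs several such jobs chained together, with each job's output feeding the next one's input; that is exactly the plumbing Pig generates for you.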

Pig is one such scripting language for Hadoop. Pig takes the developer’s requirements, expressed in a script, and produces the underlying map/reduce jobs that are executed on Hadoop. This abstraction is incredibly important: without it, expressing difficult analytical ‘queries’ directly in map/reduce would be highly time-consuming and error-prone. It can be thought of as similar to the way SQL is a higher-level abstraction that hides all the query plan routines (written in C) that operate on the data in a traditional RDBMS. Of course, abstraction makes creating analytical routines more efficient, but it comes at a performance cost. Kevin quantified his experience: typically a Pig script is 5% of the code of native map/reduce, written in about 5% of the time. However, queries typically take 110-150% of the time a native map/reduce job would have taken to execute. And of course, if a routine is highly performance-sensitive, they still have the option to hand-code the native map/reduce functions directly.

OK, so why use Hadoop and Pig instead of a more traditional approach like an MPP RDBMS? Kevin explained that there were a few reasons. Firstly, Twitter, like many Web 2.0 companies, is committed to open source and likes to use software that has a low entry cost but also allows them to contribute to the code base. Kevin mentioned that Twitter did look at some of the open source MPP RDBMS platforms but was less than convinced of their ability to scale to meet its needs at the time. The second reason is exactly that: scale. Twitter is understandably coy about exact numbers, but they have hundreds of terabytes of data (though less than a petabyte), and one could assume that to get reasonable performance they are running Hadoop on a few dozen nodes (this is a guess; Twitter didn’t say). As they grow, analytics will become more important to their business, and this may expand to hundreds (or thousands) of nodes. A “few hundred” nodes is right at the upper limit of what is possible today with the world’s most advanced MPP RDBMSs. Hadoop clusters, on the other hand, grow well into the hundreds and even thousands of nodes (e.g. at Google, Facebook, etc.).

So Hadoop was the platform choice, but why Pig? There are other “analytical” scripting languages that sit over Hadoop, notably Hive, which was popularized by Facebook (Pig was popularized by Yahoo). On discussing the merits of Pig vs. Hive it became apparent that Hive is more in tune with a traditional, “database-like” approach. Hive requires data to be mapped to a given structure, and queries (in a SQL-like derivative) are submitted against that schema. Pig, on the other hand, is less prescriptive about schema: individual queries can define the structure of the data for that execution. In addition, Pig is more of a “procedural” language, allowing a complicated data flow process to be more easily controlled and understood by developers.
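The schema distinction can be illustrated in plain Python (these are not actual Hive or Pig snippets, and the log format is invented): the Hive-style path declares one structure up front that every query reuses, while the Pig-style path imposes only the structure each query needs, at read time.

```python
# Raw, unstructured log lines, as they might sit in the cluster's
# file system. The tab-separated format here is a made-up example.
raw_logs = [
    "2009-11-20\talice\tfollow",
    "2009-11-20\tbob\ttweet",
    "2009-11-21\talice\ttweet",
]

# Hive-style: one schema declared up front; queries run against it.
SCHEMA = ("date", "user", "action")
rows = [dict(zip(SCHEMA, line.split("\t"))) for line in raw_logs]
tweets = sum(1 for r in rows if r["action"] == "tweet")

# Pig-style: this particular "query" decides, ad hoc, that it only
# cares about field 1; no prior schema declaration is needed.
users_active = {line.split("\t")[1] for line in raw_logs}

print(tweets)                # 2
print(sorted(users_active))  # ['alice', 'bob']
```

Both paths read the same bytes; the difference is whether the structure lives with the data (Hive) or with each query (Pig), which is what makes Pig feel less prescriptive for exploratory work.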

So, as mentioned, Hadoop is a batch-based job processing platform. Jobs (in this case map/reduce jobs generated from the Pig queries) are submitted and results are returned sometime in the future. Exactly when varies from a few minutes (e.g. hourly jobs that take only a few minutes to run) through to many hours for jobs that run over much larger sets of data. This leaves a gap in “near real-time” analytics between the lightweight queries they can run on the transactional system and the more intense Hadoop-based analytics. Twitter has been investigating solutions to fill this space, which will be used for things like improved abuse detection, issue analysis and so on. Twitter is currently considering its data platform options here, including Cassandra and HBase, and may even decide to use a closed-source MPP solution to fill this need (I can’t say what, sorry) due to the lack of suitable open source MPP alternatives.

For more technical info on Twitter’s use of Hadoop and Pig, you can check out Kevin’s slide deck from the recent NoSQL East conference.


Source:

http://blog.tonybain.com/tony_bain/2009/11/analytics-at-twitter.html


Google Lays The Groundwork for Extensions in Chrome

Google is getting ready to offer widespread support for extensions in Chrome, launching a program which will allow third-party developers to add features to its browser.

The company released more details about its new Chrome Extensions Gallery on Tuesday. Developers can now upload their extensions to the gallery, in effect publishing them even before the browser officially supports them. Support for Chrome extensions is available in the current developer release, and it will probably arrive in the browser for all users before the end of the year.

At the moment, there isn’t much to see in the Chrome gallery — it’s just a form for developers to upload their extensions. However, Google is clearly hoping Chrome will one day support an extension ecosystem similar to the one Mozilla enjoys with its highly successful add-ons community for Firefox. The site will offer the ability to search and browse for extensions, and Google is encouraging developers to upload videos and screenshots explaining what each extension does.

The company has also posted guidelines outlawing things like copyright infringement, hate speech and any extension designed to “enable the unauthorized download of streaming content or media” — which means we probably won’t see extensions for ripping videos from YouTube.

To that point, Chrome extensions will be reviewed before they become publicly available. The Chrome Blog says that, for most extensions, “the review process is fully automated.” However, if your extension plans to use low-level components or access file:// URIs, Google will require a manual review and may ask developers for additional information before such extensions end up in the gallery.

Source:

http://www.webmonkey.com/blog/Google_Lays_the_Groundwork_for_Extensions_in_Chrome


Tuesday, November 24, 2009

FaceTime's Database Acquisition Highlights Need for Web 2.0 Control

FaceTime Communications today announced that Check Point Software Technologies Ltd. has acquired its application classification and signature database to add a new level of security for use in Check Point Security Gateways. Check Point's purchase of the database validates the need for businesses to implement Web 2.0 controls and security.

FaceTime incorporates the database in its recently announced Unified Security Gateway 3.0 - the first secure Web gateway to combine content monitoring, logging, management and security of Web 2.0 applications, such as social networking, instant messaging, and Unified Communications, with URL filtering, anti-malware and Web anti-virus protection.

The momentous growth of Web 2.0 platforms and the benefits gained through their use introduce significant new compliance and policy challenges. Government agencies and corporations worry about sensitive information leaking out over Twitter or Facebook, and organizations now face new rules, from regulatory bodies such as FINRA, specifically relating to content posted to social networks.

"The growth in corporate use of Web 2.0 Applications and in the number of different applications being used makes compliance with legal and regulatory obligations incredibly important", said Michael Osterman, principal of Osterman Research. "Our research has shown that a large and growing proportion of corporate users are using Web 2.0 tools on a regular basis. Consequently, organizations must control how these tools are used and the information that is transmitted through them or face significant consequences for not doing so."

Source:

http://web2.sys-con.com/node/1198097


New Oxford Names Facebook’s “Unfriend” as Word of the Year 2009

The New Oxford American dictionary has named “unfriend” as the 2009 Word of the Year, which literally means removing someone as a friend on popular social networking websites like Facebook and Friendster.

The word was chosen from a pool of finalists with a technology savvy lineage, Oxford said in a statement.

Oxford’s US dictionary program senior lexicographer Christine Lindberg said that the word was chosen among the other finalists for its currency and potential longevity.

“It has an exact definition or meaning when being used as a verb or an action word by the online community. Its meaning is understood, because of this, the adoption of ‘unfriend’ as a modern form of verb makes it more interesting to be chosen as the Word of the Year,” Lindberg added.

Other finalists came from economic, technological, political, and current affairs terms, as chosen by Britain’s Oxford University Press, the publisher of the dictionary.

For technology, “hashtag” was also named, which means a hash sign attached to a word or a phrase that allows users of the popular microblogging website Twitter to search for similar tweets; also chosen were “intexticated,” which means distracted by texting while driving; and “sexting,” which means sending sexually explicit messages and MMS through cellphones.

Meanwhile, among the top choices for the economy terms were “freemium,” which means basic services that are offered free in some forms of business models; and “funemployed,” which refers to people taking advantage of their newly acquired unemployed status by pursuing their own interests.

For the political and current affairs category, the finalists included “birther,” which refers to a person or group espousing a conspiracy theory regarding the birth certificate of United States President Barack Obama; and “choice mom,” which can be simply put as a single mother by choice.

Other words that made it to the shortlist of the Word of the Year 2009 were “deleb” or a dead celebrity; and “tramp stamp” or a tattoo on the lower back of a female.

Source:

http://www.socialnetworks10.com/new-oxford-names-facebook%E2%80%99s-%E2%80%9Cunfriend%E2%80%9D-as-word-of-the-year-2009


Monday, November 23, 2009

Firefox: Heat And The CPU Usage Problem

Firefox has a CPU usage issue and, consequently, can cause overheating problems in some laptops, particularly ultraportables. That's what I've found over the last couple of years.

But don't take my word for it. This is documented on a Mozilla support page entitled "Firefox consumes a lot of CPU resources." The page states: "At times, Firefox may require significant CPU [central processing unit] resources in order to download, process, and display Web content." And forum postings like this one about a Dell Netbook are not uncommon: "Mini9 would get way too hot."

The Mozilla support page goes on to say that "you can review and monitor CPU usage through specific tools" and describes ways to limit CPU usage, such as: "A Firefox add-on, called Flashblock, allows you to selectively enable and disable Flash content on Web sites."

Let me describe my experience. I find that tab for tab, Firefox uses decidedly more resources than other browsers--Safari, for example. And in the past (when I was actively using a Windows Vista-based machine) Firefox also compared unfavorably with Microsoft's Internet Explorer for CPU usage.

More specifically, here's the behavior as I see it. When I'm accessing sites with multimedia content such as the CNET front door, Firefox CPU usage will bounce around between 30 and 60 percent, and sometimes spike higher (80 percent and above), as indicated by the Mac OS 10.6.2 Activity Monitor.

On the other hand, the Safari CPU usage with the same pages open is much lower--typically between 2 percent and 10 percent.
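For readers who want to quantify this kind of gap themselves, here is a rough, standard-library-only Python sketch of what Activity Monitor is reporting: the ratio of a process's CPU time to wall-clock time over an interval. The `busy_loop` stand-in for a page render is an invented example, not anything Firefox actually runs.

```python
# Measure what fraction of one core a workload consumes while running,
# i.e. the "% CPU" figure tools like Activity Monitor display.
import time

def cpu_utilization(workload, *args):
    """Percent of one core used by `workload` while it runs."""
    wall_start = time.perf_counter()   # wall-clock time
    cpu_start = time.process_time()    # CPU time charged to this process
    workload(*args)
    wall = time.perf_counter() - wall_start
    cpu = time.process_time() - cpu_start
    return 100.0 * cpu / wall if wall > 0 else 0.0

def busy_loop(n):
    # Invented stand-in for rendering a multimedia-heavy page:
    # pure computation that pegs the CPU.
    total = 0
    for i in range(n):
        total += i * i
    return total

print(round(cpu_utilization(busy_loop, 2_000_000)))  # near 100 on an idle machine
```

A browser idling on a static page should score in the single digits by this measure; sustained readings of 30-60 percent, as described above, are what keep an ultraportable's fan spinning.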

My theory is that most users don't notice this because in mainstream laptops, this isn't an issue. But it can become an issue in ultraportables--typically under an inch thick--which are more sensitive to heat because of the design constraints. The ultrathin Apple MacBook Air, which I use as my main machine, is a good example.

The fan is usually an audible indicator of CPU usage issues. When I'm using Firefox and I have tabs open on multimedia-rich sites (which is par for the course these days), the Air's fan will almost invariably kick on and stay on until I close the tabs. As I write this, the fan has finally shut down after I closed the Firefox tabs (e.g., CNET front door). Those same tabs in Safari are still open and not causing any significant spike in CPU usage or fan activity.

When I contacted Mozilla, a technical support person guessed that Safari is possibly better at optimizing Flash-based sites compared to Firefox. And that may be true. However, I had similar issues before when I was using a Hewlett-Packard business ultraportable (also very thin like the Air) that were not necessarily tied to Flash usage. In short, Firefox was less efficient with CPU usage compared to Microsoft's IE 8. And the behavior was similar. The HP laptop would quickly heat up and the fan would kick on.

Finally, let me reemphasize that I'm guessing most users don't notice this because heat dissipation is not a big issue for mainstream laptops, which are not necessarily thermally challenged when accessing multimedia-rich Web pages. That said, this has been a steady problem for me because I use ultraportables almost exclusively, and it has forced me to limit my use of Firefox.

Source:

http://news.cnet.com/8301-13924_3-10396076-64.html?part=rss&tag=feed&subj=Webware&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+webware+%28Webware.com%29


Friday, November 20, 2009

Sony Slides to 4-Month Low, Plan Fails to Inspire

TOKYO/SEOUL (Reuters) - Shares of Sony Corp fell to their lowest in nearly four months on Friday after the electronics maker's new business strategy failed to convince investors it could deliver strong profit growth.

Sony, which is heading for its second straight annual loss, said on Thursday it would launch 3D TVs and networked products and services as part of a plan to boost its operating profit margin to 5 percent in the year to March 2013.

The 5 percent target had originally been set by CEO Howard Stringer in 2005 for the business year to March 2008. It narrowly missed the target that year before falling into the red on the economic slowdown and tough competition.

Market players remain sceptical whether Sony can achieve its goal of turning its video game and TV operations profitable next year as it struggles to compete with overseas rivals such as South Korea's Samsung Electronics Co Ltd.

"The strong yen makes it difficult for the Japanese to compete on prices, while the Koreans' strength in volume and mass production technologies keeps them ahead," said John Park, an analyst at Daishin Securities in Seoul.

"Japanese TV makers like Sony will try hard to expand in LED-backlit TVs next year, but are likely to have an uphill battle with Korean leaders," he said.

Sony's TV business is in its sixth-consecutive year of losses as it grapples with a firmer yen and intensified competition from Samsung and LG Electronics Inc

Shares in Sony were down 2.8 percent at 2,400 yen after earlier hitting 2,375 yen, their lowest since July 29. The benchmark Nikkei average was down 1.3 percent.

The maker of Cyber-shot cameras and PlayStation games shed jobs, closed plants and sold non-core assets following the global downturn to cut costs, and investors were awaiting a convincing growth strategy from management ahead of Thursday's announcement.

Source:

http://in.reuters.com/article/technologyNews/idINIndia-44104820091120


Thursday, November 19, 2009

Internet Explorer 9: Microsoft Reveals First Details

Microsoft, with the launch of Windows 7 complete, is turning its focus toward one of its most important components: Internet Explorer. With huge pressure from Firefox and Google’s Chrome browser, Microsoft has revealed the first details of Internet Explorer 9 at the Professional Developers Conference in Los Angeles.

The new browser is very early in the development process. It’s only been in development for three weeks, as Windows 7’s release took precedence, so there are not many details on IE9. However, the theme seems to be that IE9 is going to adhere to and even improve upon web standards (albeit “responsibly”).

Some details:

- On HTML 5: Microsoft was coy about whether it would support all of the HTML 5 standards, the next generation of HTML. The company doesn’t seem willing to commit to the standard until it is set in stone, but “wants to be responsible” about supporting it.

- On JavaScript: They admit that their previous browsers don’t match the speed of Firefox or Chrome. However, it appears that IE9 looks to narrow this gap. From some of the data they presented, it looks like they’re getting closer to matching the other browsers (though they don’t beat them).

- On CSS Support: It looks like IE9 will finally get better CSS support, especially for rounded corners. It’s a disappointment though, when you consider the other browsers have supported these things for years.

- On Hardware Acceleration: IE9 will utilize DirectX hardware acceleration to improve graphics and AJAX rendering. It will push more work toward the GPU. This actually looks pretty slick from first appearances.

Most industry experts believe that Internet Explorer is a laggard in terms of compatibility, standards, and speed. Microsoft seems to be working to change that with IE9. The browser is a long way from release, though, so there’s no telling what it will be able to do and whether it can match Safari, Chrome, and Firefox enough to stop its descent into oblivion.

Source:

http://mashable.com/2009/11/18/internet-explorer-9/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Mashable+%28Mashable%29


Wednesday, November 18, 2009

Yahoo Stopping Mobile 'Go' App in 2010

On Wednesday, Yahoo will tell some mobile phone owners that it's pulling the plug on the mobile app Yahoo Go. Yahoo Go had been Yahoo's all-in-one native app of Yahoo services for Windows Mobile, BlackBerry, and Symbian phones since January 2006. It gathered Yahoo's services around a rotating carousel motif, the application's start page.

Yahoo Go, which first emerged at the Consumer Electronics Show in 2006, was full of content--but information was buried and the app wasn't intuitive to customize. Yahoo pretty much halted work after the January 2008 release of Yahoo Go 3.0 beta and began concentrating more on its Web portal. Yahoo's mobile-optimized Web site, m.yahoo.com, contains Yahoo Go's core features, like search, weather lookups, and RSS feeds for information like headline news and stocks. Yahoo's revamped mobile site also lets you check e-mail, send IMs, and track status updates on social networks.

Killing Yahoo Go is in line with Yahoo's mobile strategy, says Yahoo's global head of mobile product marketing, Adam Taggart. "In the past 18 months, Browser quality has been increasing at an accelerated rate. We've doubled down on our mobile Web strategy."

While Yahoo pours resources into streamlining its mobile Web presence, it also continues to release Yahoo Mobile applications for some mobile platforms, like the iPhone and BlackBerry. On top of Yahoo Mobile are more focused standalone applications. iPhone owners interested in stocks can download the Yahoo Finance app, for example. Sports enthusiasts have Yahoo Fantasy Football.

Support for Yahoo Go officially stops on January 12. On Wednesday, active users will see an e-mail or an update notice pushed onto the app itself that will inform them of the shut-down, and urge them to start using m.yahoo.com instead. Visiting the mobile site from some phone models will prompt a download for a compatible native app. Yahoo Mobile still isn't perfect, and it can also suffer from information overload. However, active Yahoo Go users will find that their content is intact, albeit somewhat rearranged.

Source:

http://download.cnet.com/8301-2007_4-10399819-12.html


Monday, November 16, 2009

I-Technology Viewpoint: It's Time to Take the Quotation Marks Off "Web 2.0"

That doesn’t mean there’s agreement yet on what the term means. This is one of the reasons we’re hearing about “enterprise resistance” to Web 2.0 Applications (more on this below).

Web 2.0, after all, means different things to different people:

  • To the programmer, it’s a set of tools and techniques that have the potential for fundamentally altering how network based applications and data are developed, managed, and delivered.
  • For start-ups and venture capitalists, it’s an opportunity to get in on the ground floor of another “bubble.”
  • For the corporate CIO or IT manager, it’s another set of technologies and architectures to be adopted and supported in an era of continued I.T. department budget strains.
  • For newer or smaller companies, it’s an opportunity to acquire technical and business process infrastructure at a fraction of the cost of the investments made by older and legacy companies.
  • For the marketing manager it’s an opportunity to “end-run” a traditionally unresponsive I.T. department.
  • For the customer it’s an opportunity to establish and maintain relationships that are both personally fulfilling and empowering in the face of the traditional power of larger institutions.
  • For the CEO of an established legacy industry company, it’s a threat of loss of control over customer relations.

With so many perspectives, it's no wonder that it's difficult to get a clear picture. We’re dealing not only with shifting technical architectures but also with shifts in how individuals and organizations use the Internet. We know that different industries adopt technology at different rates. In the case of Web 2.0, we're talking not just about changes in technology and associated business processes, but also about changes in the relationships that are built around how systems are developed and used.

Reality: Things Don't Always Work

My personal interest in this subject is anything but academic. As a management consultant I help companies plan and manage changes to technology and processes. With Web 2.0 the opportunity for change is massive.

This change is not going to happen overnight. There’s always a risk when a new technical architecture is introduced. Part of the risk is getting components of the architecture into the hands of the users. Another is making sure those components work reliably.

I was reminded of this recently when my wife called about an email she had received that contained a link to something she couldn't read: "I'm trying to show an attachment to an e-mail I got from a client but I can't. I called her to ask what to do and she told me to make sure that 'Flash was turned on'. What does she mean by 'Flash'?"

I explained what "Flash" from Macromedia is and fixed her problem, but not until she had experienced a significant delay in communicating with a prospective client -- all because a popular web browser add-in wasn't properly configured.

This got me to thinking about the current evangelizing that is swirling around Web 2.0, AJAX, SOA, and tools like Ruby on Rails, a web application development framework. I've been impressed with the mass of available applications that offer sophisticated functionality without requiring a "heavy client footprint." Just check out Christian Mayaud's list of Web 2.0 applications if you want to be amazed (and amused).

I thought back to my wife's question. "Flash" is one of those "helper" applications that an entire industry and developer community has grown up around. It's now firmly a part of the Internet infrastructure. That wasn't always the case. As I saw with my wife's question, there are still "pockets" of users where an unknown – but simple -- configuration setting can cause the final step in a complex communication channel to fail.

Parallels with Web 2.0

This got me to thinking about what has to happen each time an AJAX based application is used. Some current "mashups" might be combining widely available public data. With "enterprise" types of applications we might be talking about the over-the-web handling of valuable -- or sensitive -- personal or financial data. Reliability and stability of all parts of the server, net, and client will be critical. All will have to work together to ensure reliable two-way interaction.

Is uncertainty about this reliability one of the reasons why some corporate IT managers are taking a wait-and-see attitude about "web 2.0"? My wife's temporary problem with a supposedly mature piece of the web infrastructure is probably not too unusual. There are many users out there who work day in and day out without paying special attention to extensions, helper applications, thin clients, RSS feeds, and the like. There are probably a lot of them for whom managing the vicissitudes of a commonly available component such as Flash is at best, an annoyance.

In theory, the population of users like my wife might be a prime target of the rich functionality Web 2.0 delivers for remotely served applications, sometimes referred to as "Web Office" or Office 2.0. Technologies such as AJAX (and Flash, ActiveX, and Java, as well) help deliver near-desktop-quality functionality (some may argue with the adjective "near") without requiring the permanent installation of massive amounts of (expensively licensed) software on the client machine. I’m sure that many corporate I.T. folks view the potential simplification of the client’s configuration as A Good Thing.

But lots of dice have to roll the right way for all parts of the channel to work every time. Sometimes problems occur, as my wife's experience attests -- and she was dealing with one of the oldest and most venerable components for delivering rich media content.
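The fragile-channel point above can be sketched in a few lines. This is not from the article; it is a minimal Python illustration (all names are my own) of the graceful-degradation idea: retry the rich path a few times, and if it never succeeds, fall back to something plain rather than failing outright.

```python
def deliver(send, retries=3, fallback=None):
    """Try an unreliable channel a few times; fall back if it never succeeds."""
    last_error = None
    for _ in range(retries):
        try:
            return send()
        except ConnectionError as exc:
            last_error = exc  # one link in the channel failed; try again
    if fallback is not None:
        return fallback()  # degrade gracefully, e.g. to a plain-HTML view
    raise last_error

# A rich AJAX/Flash view that happens to be broken on this client...
def rich_view():
    raise ConnectionError("helper application not configured")

# ...still leaves the user with something readable.
print(deliver(rich_view, fallback=lambda: "plain-HTML view"))  # prints "plain-HTML view"
```

The same shape applies whether the weak link is a browser plug-in, a slow AJAX round trip, or a flaky network hop: the user should always end up with something usable.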

(I think of issues related to this every time I switch over from Yahoo's old-but-serviceable Web Mail interface to the Yahoo! Mail Beta. I'm using the Yahoo! Mail Beta via my office DSL connection, and I always have to wait for data to load. And while I love the "drag and drop" functionality the Yahoo! Mail Beta interface provides, the hesitation of the interface grates after a while, especially when handling as much daily e-mail as I do.)

Are Internet users willing to accept such performance glitches "outside the firewall" in order to gain access to an attractive interface and functionality that looks like it's running from a local client?

I suspect my wife won't. She may not be a “power user,” but her standards for performance are high. (I know - I hear about it whenever our home network slows down!)

Web 2.0 Enterprise Hurdles

If Web 2.0 applications built around AJAX and related technologies are to succeed in the "enterprise," several dice have to roll the right way:

  • Tools, development, and testing processes must continue to mature (this is happening).
  • Tools, development, and testing processes must be accepted into the enterprise -- in addition to, or in replacement of, the architectures that are already there (e.g., how many development platforms is an IT department willing to support?).
  • Data security and stability issues must be solved -- especially when it comes to handling sensitive customer and financial data.
  • The new architecture must deliver -- and have documented -- (a) reduced costs, (b) added benefits, or (c) both (a) and (b).
  • Company executives must be willing to accept a new network architecture paradigm along with its frequent association with "social networking" functionality that many people are still not comfortable with.

Except for the last bullet point, the issues here are similar to issues associated with the introduction of any new programming language or development framework into the enterprise. The costs of changes to process and technologies have to be outweighed by the promised benefits. In that sense, AJAX is no different from other evangelized technologies that have come before, except that the Web now provides (potentially) the data, delivery platform, and the medium for promotion (and hype) – outside the firewall.

More Than Technical Architecture Challenges

I’m going to make a leap of faith here and predict that issues of security, reliability, maintainability, privacy, and functionality that are associated with web based Web 2.0 applications are going to be successfully addressed and resolved. Call me optimistic, but there’s a lot of creativity out there and I believe that current technical challenges will be overcome, and quickly.

But let's return to that last bullet point in the above section. One of the most astute descriptions of Web 2.0 adoption is the recent blog article by Thad Scheer of Sphere of Influence called Monetizing Value of Social Computing in Traditional Industries. (Author disclosure: Thad is a friend of mine.)

Thad is the CEO of a DC-area consulting firm that serves both government and corporate customers. He bases his statements on what he sees as resistance from traditional "brick and mortar" industries to adopting a new customer relationship paradigm. According to Thad, the executives of these industries see Web 2.0 as reducing their control over their data and their customer relationships, especially when "social networking" functionality is included in the mix. (Some of my own Web 2.0 Management Survey interviews have borne this view out.)

While Thad goes on to assess the more willing acceptance of Web 2.0 by newer firms and more consumer-product-oriented firms, the issues he points out are not issues of technical architecture; they are issues of functionality and of the expectations executives have about how they relate to their customers.

It’s entirely reasonable that certain population segments and certain types of business transactions will be regarded by corporate executives as outside the realm of either existing or Web 2.0 infrastructures. Not every business transaction can -- or should -- be handled electronically, nor does it need to be handled in a fashion that increases a sense of intimacy.

Change Comes Fast

Changes to social interactions and to network infrastructure usually can't happen overnight. But pressure to adopt more collaborative and interactive techniques based on "social software," even with populations that might have traditionally been thought of as being resistant to such approaches, might develop faster than some people think. In particular, knowledge workers inside and outside even the traditional industries will expect more conversational and interactive communications both within their companies and with the companies -- and customers -- they deal with. Management will need to adapt to the fact that employees are now able to engage with customers more frequently and on a more personal level than ever before. This engagement can lead to loyalty.

Companies will respond differently depending on their structures, management styles, and regulatory constraints. Some will have a small number of "CEO-style" blogs with commenting "turned off." Some will organize groups of staff members to interact in a structured fashion with different groups of users with management structures potentially modeled after high end professional call centers. Still others will engage all staff to interact as a normal part of their job (just like answering the phone). Some may adopt Web 2.0 technologies but only within the corporate organization itself.

In many cases, the I.T. department will be called upon to evaluate and if necessary provide support for technology platforms that may interface -- reliably and safely -- with other corporate systems.

Competition Spurs Change

Will the "brick and mortar" businesses that Thad writes about just sit by while smaller and more agile competitors, who have much less invested in legacy infrastructure, nibble away at their businesses?

I don't think so. It's from such competitors that the pressure on traditional companies to adopt "Enterprise Web 2.0" most likely will come.

I am not prepared to rule out the ability of even large “legacy” companies to adopt the faster, more open and agile ways of smaller competitors. In fact, the ability to effectively innovate and manage using collaborative technologies that incorporate social networking methods may turn out to be the next “big thing” in management, comparable to the quality improvement movements spawned in the 20th century by foreign competition to U.S. manufacturing. This might even rival – and compete with – the current resources being put into process and system outsourcing.

The wave of change will not be limited to manufacturing but will impact all sizes and types of industries that need to communicate with their customers. That’s everyone.

How to Move Forward

OK, so much for analysis; let’s get practical. The following is my advice to managers, executives, and employees who want to take advantage of Web 2.0 technologies:

  1. Start small. Do a prototype, not an enterprise rollout. Focus on early results based on well defined, achievable goals.
  2. Involve both business and IT. Even if you can negotiate deals with external software vendors without the involvement of your IT department, don’t do it. Your IT department probably has more experience with knowing what might go wrong than you do!
  3. Minimize integration complexity. Don’t start with a project that requires manipulation of the company’s “crown jewel” financial or customer data. Keep it simple.
  4. Focus on business benefits. Don’t do something because it’s free, easy, or nice to know. Focus on a system that supports the financial or strategic goals of your organization. It’s always easier to justify costs associated with money making activities than costs associated with overhead and administration.
  5. Know your costs. Even if you’re using existing staff, existing networks, existing hardware, and free software, don’t pretend that time and infrastructure are free. Keep track of staff, user, and support staff time. If you’re successful with your initial project you’ll have to translate these into dollars.
  6. Use the technology. Practice what you preach. If you’re experimenting with web based tools for customer communication, use web based tools to support project management and staff communication.
  7. Face issues headlong. Have a process to track and resolve all business, personal, and technical issues that arise during the course of the project.
  8. Don’t demonize the opposition. Opposition to Web 2.0 may be justified. Listen and find out what the core issues really are, surface them, and address them.
  9. Remember it’s a business. Even if you’re working with collaborative software that supports personalization, relationships, and intimacy, remember that there needs to be a reasonable line drawn between business and personal matters.
  10. Manage. Don’t just toss the technology into the user community. Be prepared to guide, give feedback and, where necessary, lead. Remember: no matter what project management philosophy you follow, projects don’t manage themselves, especially projects where you are breaking new technological or cultural ground.
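Point 5 in the list above ("Know your costs") boils down to simple arithmetic: track hours by role, multiply by a loaded rate, and add infrastructure. As a sketch only -- the roles, hours, and hourly rates below are invented for illustration, not taken from the article:

```python
def project_cost(hours_by_role, hourly_rates, infrastructure=0.0):
    """Translate tracked time into dollars: sum (hours x rate) per role, plus overhead."""
    labor = sum(hours_by_role[role] * hourly_rates[role] for role in hours_by_role)
    return labor + infrastructure

# Hypothetical pilot-project numbers.
cost = project_cost(
    {"developer": 120, "support": 30, "user_testing": 25},
    {"developer": 85.0, "support": 55.0, "user_testing": 40.0},
    infrastructure=500.0,
)
print(f"${cost:,.2f}")  # prints "$13,350.00"
```

Even a back-of-the-envelope tally like this gives you the dollar figure you will need when it is time to justify a wider rollout.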

Source:

http://web2.sys-con.com/node/207411


Sunday, November 15, 2009

Murdoch to Hide News Corp Content From Google Within Months

A couple of days ago, in an interview with Sky News Australia, Rupert Murdoch explicitly said he plans to make News Corp’s content invisible to search engines.

Now, News Corp’s chief digital officer Jonathan Miller has revealed a timeframe in which this is supposed to happen. Speaking at the Monaco Media Forum, Miller said it will happen within “months and quarters – not weeks.”

He then repeats Murdoch’s mantra, claiming Google’s search traffic is next to useless. “The traffic which comes in from Google brings a consumer who more often than not read one article and then leaves the site. That is the least valuable of traffic to us… the economic impact is not as great as you might think. You can survive without it,” he said.

It may seem like quite a crazy idea, but News Corp plans to bring other media companies along for the ride. “We will lead. There is a pent up need for this. There has to be a resolution for the free versus pay debate otherwise we cannot afford to pay for things like news bureaus in Kabul,” Miller said.

In a way, it’s a big experiment, and since Murdoch (thinks he) can afford it, the rest of the world can simply stay on the sidelines and watch what happens. My guess is that News Corp’s properties will simply get less traffic, and in turn sell fewer ads and earn less money, creating losses that money from subscriptions won’t cover.

I just cannot imagine a world in which all of the big creators of news create a paywall around their content and then everyone decides that this content is so great they should pay for it instead of finding it elsewhere for free. The landscape of the news industry and the way news is disseminated has changed so much that sticking to the old business model and completely shutting off new possibilities won’t work. I might be wrong, though; I guess we’ll know soon enough.

Source:

http://mashable.com/2009/11/13/murdoch-news-corp-google/


Friday, November 13, 2009

Microsoft Developer Technologies Take Center Stage at PDC

With a full schedule on tap, the 2009 Microsoft PDC (Professional Developers Conference) next week will tout efforts ranging from the company's Windows Azure cloud platform to plans for programming languages.

Session descriptions for the Los Angeles event reveal topics such as "Architecting and Developing for Windows Azure," "Future Directions for C# and Visual Basic" and "Software + Services Identity Roadmap Update."

Introduced at last year's PDC, Azure remains a hot topic for the company, based on this year's PDC schedule. One session, entitled "Bridging the Gap from On-Premises to the Cloud," invites attendees to hear "how Microsoft views the future of cloud computing and how it is starting to deliver this vision in the Windows Azure platform."

"Learn how applications can be written to preserve much of the investment in code, programming models, and tools, yet adapt to the scale-out, distributed, and virtualized environment of the cloud," the session description reads. Migrating applications to Azure also will be covered at PDC.

In addition to hearing C# and Visual Basic plans, attendees interested in programming languages can learn about Axum, which provides a .Net language for "safe and scalable" concurrency, the PDC Web site states.

Axum is a project from Microsoft's Parallel Computing Platform. "It's a language that builds on the principles of isolation, agents, and message-passing to increase application safety, responsiveness, scalability and developer productivity," according to the PDC site.

A roadmap for the Silverlight rich Internet application platform will be detailed at PDC. Microsoft's M data and modeling language will be covered as well. "Explore the future of M, where DSL, schema, and lots of other great ideas come together as a single Web-centric data processing language," the PDC Web site states. XAML futures for .Net Framework, Silverlight, and tools will be covered in a separate session.

The Microsoft "Velocity" project, featuring distributed in-memory caching, is to be detailed at PDC. Velocity "will change how you think about scaling your Microsoft .Net-connected applications," a session description reads. Also on the roster of sessions is one entitled, "InferNet: Building Software with Intelligence." Infer.Net is a machine-learning framework for building .Net software that can adapt to a user, learn from examples or work with uncertain information.

Also at PDC, Microsoft will discuss its SQL Server Modeling technology, formerly known as the Oslo modeling platform, Microsoft's Douglas Purdy, a software architect, revealed this week in a blog. But the transformation of Oslo has not sat well with some Microsoft observers.

"Oslo seems to have gone from a potential new enterprise architect modeling platform to just a modeling tool and DSL stack for SQL Server. This ignores all those large enterprise IT departments with heterogeneous data platforms," said one person commenting in the blog.

A community technology preview featuring SQL Server Modeling technologies is set for release at PDC.


Source:

http://www.infoworld.com/d/developer-world/microsoft-developer-technologies-take-center-stage-pdc-292


Wednesday, November 11, 2009

Microsoft, IBM And Yahoo Are Vying To Take Part In India’s Unique ID Project

It appears that both Yahoo and Microsoft are duking it out to help power the technology for India’s Unique Identification project. Spearheaded by Indian tech czar and Infosys co-chairman Nandan Nilekani, the project aims to assign every Indian citizen a unique identification number that will identify him or her, similar to a U.S. Social Security number.

This is no small task considering India’s population of 1.2 billion citizens. It will require powerful technology to assign the numbers and a vast database to organize each unique ID. That’s where Microsoft and Yahoo come in.
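The article doesn't say how the IDs will actually be constructed. Purely as an illustration of the kind of machinery involved: large numeric ID schemes commonly append a check digit so that transcription errors are caught at data-entry time. The sketch below uses Luhn's well-known check-digit algorithm; this choice, and the 12-digit format, are my assumptions for demonstration, not the project's real scheme.

```python
def luhn_check_digit(digits):
    """Compute a Luhn check digit for a string of decimal digits."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:       # double every second digit, counting from the right
            d *= 2
            if d > 9:
                d -= 9       # same as summing the two digits of the product
        total += d
    return str((10 - total % 10) % 10)

def make_id(serial, width=11):
    """Zero-pad a serial number and append its check digit (hypothetical format)."""
    body = str(serial).zfill(width)
    return body + luhn_check_digit(body)

print(make_id(123456789))  # a 12-digit ID whose last digit catches most typos
```

A check digit rejects single mistyped digits and most adjacent transpositions before they ever reach the central database, which matters when a billion-plus records are in play.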

Earlier this year, Microsoft chairman Bill Gates expressed a strong interest in participating in the project, meeting Nilekani and assuring him that Microsoft would be able to assign the IDs swiftly.

This week Yahoo CEO Carol Bartz lobbied India’s Prime Minister Manmohan Singh to use Yahoo for the project, but Bartz says that there’s no commercial interest in the deal and Yahoo would help power the project on a non-profit basis. Bartz added that Yahoo would be the optimal choice because Yahoo has a major presence in India. The company claims that three out of four Indians access the Internet through Yahoo.

While Yahoo is vastly popular in India thanks to sites like Yahoo Cricket that appeal to the population, its hold may be slipping. Gmail recently overtook Yahoo Mail as the most trafficked email site and Yahoo was forced to shut down its Indian social network SpotM a few months ago, as Google’s Orkut and Facebook emerge as the dominant social networks in India.

It’s unclear if Microsoft has the same “non-profit” stance as Yahoo, but obviously both companies want a piece of a highly ambitious project that could be implemented in other emerging countries. And it looks like IBM is also throwing its hat into the ring as well, so it should be interesting to see which tech giant wins out.

Source:

http://www.techcrunch.com/2009/11/11/microsoft-and-yahoo-are-battling-to-take-part-in-indias-unique-id-project/


Motorola's Droid May Be a Boon for iPhone Owners

In a nod to my colleague Tom Wailgum, who wrote What Star Wars Teaches Us About Career Management, I'm waving a hand in front of mobile users and saying, "This is the Droid you're looking for"—and iPhone owners should be wishing for the same thing.

I'm talking, of course, about the Motorola Droid introduced by Verizon on Wednesday. The very cool smartphone hits Verizon Wireless stores' shelves on Nov. 6 and will cost $200 after a $100 rebate, along with a nationwide calling plan and a $30 data plan.

Let's be clear: The Droid is the first big contender to cross swords with the almighty iPhone. And it's got powerful allies like Google and Verizon, as well as Motorola on the comeback trail. The Droid boasts a beautiful high-res screen, multi-tasking apps, free turn-by-turn GPS navigational system, and both a touch keypad and hardware keypad.

In comparison, the iPhone's screen isn't as sharp. The iPhone also doesn't support multi-tasking apps (at least, not widely). While Tom-Tom offers an iPhone app for turn-by-turn GPS, the app costs a whopping $100. And, of course, the iPhone has only a touch keypad.

The Droid, though, may be a boon for iPhone owners thanks to the benefits that spring from increased competition. Mobile app developers, for instance, will no longer be able to corner the smartphone market with a single iPhone app. They will need to develop on multiple platforms and compete with a whole new crop of smart developers.

Increased competition will also put the screws to Apple. In the past, Apple could dictate to the market what works and what doesn't work, what is possible and impossible. Now the market gets to dictate. "If Droid has background processing and still gives good battery life, this could put some pressure on Apple to rethink background processing," says Gartner analyst Van Baker.

But competition only goes so far—that is, Apple doesn't respond to competitive pressures very often. It's unlikely, for instance, that Apple will abandon its imperialistic ways concerning the App Store and Adobe Flash. (The iPhone does not support Flash, whereas Droid does.)

It's also unlikely that iPhone owners will flock to the Droid and Verizon's 3G network, thus taking some pressure off of AT&T's network. "Maybe a few iPhone 1.0 users whose contracts have expired" will move to the Droid, Baker says, "but not too many."

It's more likely that Droid owners will tap the Internet as much as iPhone users do, which is to say a lot. And Verizon could end up having network and customer service fallouts similar to those of AT&T.

Source:

http://www.pcworld.com/article/181113/motorolas_droid_may_be_a_boon_for_iphone_owners.html


Tuesday, November 10, 2009

Oracle's Sun Deal: Oracle May Need to Loosen Its Grip



When Oracle announced its $7.4 billion acquisition of Sun Microsystems in April, the software behemoth was acting on a grand vision. The deal was part of Oracle's aim to become a soup-to-nuts supplier of everything companies need to run their computer systems, from chips and operating systems to databases and business programs. The grand plan may need some revision.

In order to alleviate pressure from European Union regulators worried about Oracle's (ORCL) growing power, Oracle may be forced to give up some control of a key aspect of the deal: the open-source MySQL database software owned by Sun.

The EU on Nov. 9 formally objected to Oracle’s acquisition of Sun. A Sun regulatory filing said the EU believes Oracle’s ownership of MySQL would have “potential negative effects on competition” in the $19 billion-a-year database market. Sun (JAVA) makes computer systems and software including the Java programming language and the MySQL database, a kind of electronic filing system.

"Taking a Tough Stance"

MySQL, available free of charge, runs the Web sites of some of the Internet's biggest brands. Among them: Twitter, Facebook, Google (GOOG), and Yahoo (YHOO). "They're taking a tough stance because 10 years down the road this could be a pretty big competitor to Oracle," says a securities analyst who asked not to be named because he was expressing personal views on the deal.

After the EU said in September that it was looking into aspects of the deal, its more formal objection to Oracle's acquisition of Sun sets the stage for negotiations on how to make the deal pass muster. Regulators may ask Oracle to release a new version of MySQL that it doesn't control to preserve competition. Sun bought MySQL for $1 billion in 2008.

Oracle declined to comment for this article, but issued a statement Nov. 9 that said the EU’s objection "reveals a profound misunderstanding of both database competition and open-source dynamics…Because MySQL is open source, it cannot be controlled by anyone," Oracle said.

MySQL is distributed under an open-source licensing agreement, which lets users freely modify its code. Companies including Google and Amazon.com (AMZN), and a software development project called Drizzle that's staffed partly with Sun employees, have already modified the database or incorporated it into commercial products without buying an officially supported version from Sun. For example, Amazon.com on Oct. 27 announced that customers can rent the MySQL database from Amazon over the Internet, paying by how much data they store and transfer. Google maintains its own version of MySQL, too.

The EU Is Positioning Itself

The presence of these versions in the wild suggests that forcing Oracle to spin out yet another version of the software may be redundant. "The remedy's already there," says one industry executive familiar with the EU's thinking, who says a regulated new version of the software would have little impact on the way companies license and use MySQL, which is prized for its speed and adaptability to running large Web sites. "The vast majority of the installed base isn't controlled by the vendor…I'm at a loss why any other remedies would be needed," this person says. "There's something illogical in the whole thing."

Whatever the proposed concessions, even Oracle's competitors believe the Sun deal will get done. "They're doing what the EU always does—making provocative statements," says an executive at an Oracle competitor. Analysts say the EU wants to position itself as an advocate for technologies that are more open rather than proprietary—closely guarded by license owners.

Source:

http://www.businessweek.com/technology/content/nov2009/tc2009118_559520.htm


Monday, November 9, 2009

Google's Desire to Scan Old Books has Critics Casting it as Goliath



Google's ambitious plan to scan millions of old, out-of-print books, many of them forgotten in musty university libraries, has turned into one of the biggest controversies in the young company's history.

A broad array of opponents, ranging from Google competitors Microsoft and Amazon to libraries and copyright scholars, has joined forces to oppose Google's proposal to create a comprehensive online repository of the books and split the revenue from access to that catalog with authors and publishers.

Facing a November deadline to revamp the proposal, which Google struck with the book-publishing industry after a class-action lawsuit, the company with the unofficial motto "Don't Be Evil" is fighting the perception among some that the plan is an unseemly power play to seize the lucrative dominance of digital books.

Company co-founder Sergey Brin has said repeatedly in recent weeks that Google is primarily acting with the public good in mind to preserve the world's cultural heritage in old books.

"I've been surprised at the level of controversy there," Brin said at a recent Internet conference in San Francisco. "Because digitalizing the world's books to make them available, there's been nobody else who's attempted it at our scale."

Federal regulators didn't see it that way, with the U.S. Department of Justice asking a federal judge this fall to reject the proposed class-action settlement. Department officials say Google's plans would potentially violate federal antitrust laws.

The issue has clearly become more prominent because of Google's vast ambitions -- its dominant search index of more than a trillion web pages, and its march into digital maps, mobile phones, online video and other sectors of the Internet.

The controversy has shown Google is not that different from other profit-driven corporations, said Siva Vaidhyanathan, a professor of media studies and law at the University of Virginia.

"It really took something this big and grand to show that Google does have problems, and does have vulnerabilities, and can be exploitative," Vaidhyanathan said. "I'm as surprised as anybody that this turned out to be the moment in which Google's true nature came to light."

Source:

http://www.physorg.com/news176738669.html


Friday, November 6, 2009

Facebook Poll Launches Secret Service Investigation


If you are a developer, you had better watch the content you post on Facebook, or else you could be on the receiving end of a Secret Service investigation. This occurred after a third-party application that allowed users to create Facebook polls was pulled. A recent poll placed on Facebook posed the following question: “Should President Obama be killed?”

The answers to the poll were “yes,” “maybe,” “if he cuts my healthcare” and “no.” Secret Service spokesman James Mackin has said an investigation has been launched because of the poll. Facebook users reported the poll to the site owners and it was quickly taken down. The application has been suspended until the developer can prove that it can better filter the content generated by poll creators. Facebook handles these situations by asking users to police the site and flag any inappropriate content.
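The article doesn't describe how the developer might filter poll content, but the simplest (and weakest) approach is a keyword blocklist. The sketch below is a deliberately naive illustration with hypothetical names and terms; real moderation pipelines combine automated filters with human review and user flagging, as Facebook's does.

```python
# Hypothetical blocklist; a real one would be far larger and context-aware.
BLOCKED_TERMS = {"kill", "killed"}

def is_allowed(poll_question):
    """Reject a poll question if any word, once de-punctuated, is blocklisted."""
    words = {w.strip("?.,!\"'").lower() for w in poll_question.split()}
    return not (words & BLOCKED_TERMS)

print(is_allowed("Should the stimulus bill be repealed?"))  # True
print(is_allowed("Should President Obama be killed?"))      # False
```

The obvious weakness is that word-level matching misses paraphrases and catches innocent uses, which is exactly why platforms lean on user flagging as a backstop.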

It appears that the application was created over the weekend and first appeared on the developer’s page Advanced Alien Technology. The user that created the poll has yet to be named.

Source:

http://www.gadgetell.com/tech/comment/facebook-poll-launches-secret-service-investigation/


Wednesday, November 4, 2009

Six Inconvenient Truths about SEO

You've probably seen them: programs claiming to teach you how you can use SEO to boost your Google rankings and in turn build a successful internet business that runs on cruise control. All for the low price of $49.95.

While such programs almost always fall into the 'scam' category, there is truth to the notion that SEO can be a pathway to success. If you run any sort of website, chances are you need traffic, and SEO can deliver it. But there are some inconvenient truths about SEO that often get ignored, especially in 'newbie' circles. Here are six of them.

SEO takes time. Everyone loves the idea of instant rewards. Unfortunately, it's hard to find instant rewards in the world of SEO. In my opinion, any commitment of less than six months isn't a commitment at all; in competitive markets, my experience is that you shouldn't expect results that drive meaningful returns before then.

SEO is not fair. Paid links work and your competitors might be winning by using them in flagrant disregard of Google's 'rules'. But that's life in the world of SEO. Just as in life generally, some people break the rules, get away with it and prosper as a result. You can fret about this or focus on your own efforts.

Top SERPs aren't always possible. Hard work and patience can go a long way, but let's be honest: everybody in your market wants top SERPs for the valuable keywords, and you can't all have them. Bottom line: there's only one top spot, and only one site can hold it.

SEO is not all about process. The mechanics of kicking a soccer ball are easy enough to understand, but chances are you're not Ronaldo. The same is sort of true of SEO. You can learn the mechanics, from page structure to link building, but there's something beyond them: some people simply have a knack for putting it all together and getting to the top of the SERPs.

SEO isn't a standalone marketing strategy. I cringe whenever somebody tells me that his marketing strategy is "viral marketing," and "SEO" isn't much better. Sure, some people use nothing more than SEO and their websites bring in big bucks, but for the average person just starting out, SEO should be treated as one potentially valuable part of a more comprehensive marketing strategy. That doesn't mean SEO can't eventually account for more than 50% of where you spend your time and money, but depending on the nature of your business, paid search, display, email and offline techniques can all play a big role too and are worth considering.

The last mile counts more than the first 99. The process of getting great SERPs can be a long, strenuous journey. But getting great SERPs isn't the end of this hundred miler. Success is based on what you do with those SERPs. So if you're in the business of generating leads, for instance, all the organic search traffic in the world won't do you an ounce of good if you're not converting, or converting but still leaving lots of money on the table. Bottom line: SEO done right delivers intent; the job of your website is to convince the user to take action.
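That last point lends itself to simple arithmetic. Here is a quick back-of-the-envelope sketch in Python; all of the numbers (visits, conversion rate, lead value) are hypothetical, and it merely illustrates that doubling conversion is worth exactly as much as doubling traffic.

```python
def monthly_revenue(visits, conversion_rate, value_per_lead):
    """Revenue from a lead-gen site: traffic x conversion x lead value."""
    return visits * conversion_rate * value_per_lead

# Hypothetical baseline: 10,000 visits, 2% conversion, $50 per lead.
baseline = monthly_revenue(10_000, 0.02, 50)   # -> 10000.0
double_traffic = monthly_revenue(20_000, 0.02, 50)
double_conversion = monthly_revenue(10_000, 0.04, 50)

# Doubling conversion rate pays exactly as much as doubling traffic.
print(baseline, double_traffic, double_conversion)
```

The takeaway matches the article's point: once the SEO "miles" are behind you, the conversion work on your own site moves the revenue needle just as much as more traffic would.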

Source:

http://econsultancy.com/blog/4882-six-inconvenient-truths-about-seo

Chicago Web Site Design Company

Tuesday, November 3, 2009

Google Chrome Beta Gets Bookmark Sync

The test version of Google's browser will let you save your favorite sites and access them from multiple computers.

The latest beta version of Google's Chrome Web Browser is making it easier for you to keep track of all your favorite Web sites across multiple computers. The search giant introduced bookmark syncing this week as a feature of Chrome's latest trial version. Google started testing bookmark syncing earlier this year on developer builds of Chrome, and its release on the beta channel means bookmark syncing is one step closer to becoming a standard feature of Chrome's stable version.

Bookmark Sync

Once you've downloaded the Chrome beta, you can access the new feature by clicking on the wrench icon on the far right side of your browser window. Then select "Synchronize my bookmarks," and a pop-up window should appear asking you for your Google Account information. Sign in, and Chrome will store your bookmarks in your Google Docs account. To sync your bookmarks across multiple locations, just download the beta version of Chrome on each computer you use, and repeat the steps outlined above.

When you add, delete, or edit your Chrome bookmarks on any device, those changes will be updated across all your computers. You can also add bookmarks from other Web browsers like Firefox and Internet Explorer by importing the data into Chrome, and the new additions will be automatically synced with the file in your Google Docs account. Google does not allow you to edit bookmarks directly from Google Docs.

Other Alternatives

If Google Chrome is not your thing but you like the concept of bookmark syncing, you can also get the same functionality on other popular browsers. Internet Explorer users can download the Windows Live Toolbar to store and sync bookmarks with Microsoft's online storage service, SkyDrive.

Firefox users can download the Xmarks add-on, which synchronizes your bookmarks and passwords. And Opera users can get in on the syncing action through Opera Link, which stores bookmarks, speed dial entries and more. You can also access your Opera Link data from competing browsers at link.opera.com. If you don't want to be tied down to a specific browser brand, try Delicious, the social online bookmarking site.

Speed

Google says the latest developer build of Chrome is thirty percent faster than the browser's current stable version. Chrome's new speed claims come on the heels of similar statements from Mozilla, which released the beta version of Firefox 3.6 on Friday. Despite the media attention heaped on Chrome since its initial release last year, the browser is still far behind in popularity compared to the two market leaders: Microsoft's Internet Explorer and Mozilla Firefox.


Source:

http://www.infoworld.com/d/applications/google-chrome-beta-gets-bookmark-sync-746

Chicago Website Design

Monday, November 2, 2009

Windows 7 Share Surges 40% in First Week of Release

Overall, the Windows OS continued to lose share globally while Apple's Mac OS X picked up most of that loss, according to Net Applications

Windows 7's market share surged nearly 40 percent in the week following its release, according to Web measurement company Net Applications.

Overall, Windows continued to lose share globally, dropping 0.23 of a percentage point during October, while Apple's Mac OS X picked up most of that loss, gaining 0.15 of a point to finish the month near 5.3 percent, its highest ever.

For the week after Microsoft launched Windows 7 on Oct. 22, the new operating system's share averaged 2.66 percent, a jump of more than 39 percent over the 1.91 percent average for the part of October prior to its retail release.

Windows 7's peak of 3.48 percent on Saturday, Oct. 31, represented an even larger 82 percent increase over the average of Oct. 1 through Oct. 22. For the month, Windows 7 finished with a market share of 2.15 percent, up 41 percent over the 1.52 percent for September. The numbers from Net Applications mean that about one in every 44 personal computers was running Windows 7 last month.
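The percentage jumps above are straightforward relative changes, and they can be checked directly against the article's own figures with a couple of lines of Python:

```python
def pct_change(new, old):
    """Relative increase of `new` over `old`, as a percentage."""
    return (new - old) / old * 100

# Figures from the article (Net Applications, October 2009):
print(round(pct_change(2.66, 1.91)))  # week after launch vs. pre-release average -> 39
print(round(pct_change(3.48, 1.91)))  # Oct. 31 peak vs. pre-release average     -> 82
print(round(pct_change(2.15, 1.52)))  # October vs. September monthly share      -> 41
```

The results line up with the "more than 39 percent," "82 percent" and "41 percent" increases quoted in the piece.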

But some countries boast a much higher Windows 7 share, Net Applications said. "Upon analysis of global Windows 7 usage share, we noticed a distinct pattern," the company said in a note posted on its site. "Russia and many Eastern European countries already have significant share of Windows 7 usage. We are sure these are all properly licensed users."

The tongue-in-cheek comment was well taken: of the top 25 countries by Windows 7 usage, 17 are in Eastern Europe or were formerly part of the U.S.S.R. Slovenia, where 7.8 percent of computers ran Windows 7 last month, led the list, followed by Lithuania in the No. 3 spot (6.5 percent), Romania at No. 4 (6 percent) and Latvia at No. 5 (6 percent). In Russia, at No. 21, 4.2 percent of all machines ran Windows 7.

Net Applications' implication -- that the Windows 7 share in Eastern Europe is due to counterfeit copies -- is backed up by data from a May 2009 report generated by the Business Software Alliance (BSA), an industry-backed anti-piracy organization, and research firm IDC. In 2008, the piracy rate in Central and Eastern Europe was the highest of all seven regions the BSA and IDC tracked.

Slovenia's piracy rate -- the estimated percentage of all software in use that is not legally licensed -- was 47 percent last year, more than double the rate in the U.S. Lithuania, Romania, Latvia and Russia, meanwhile, had piracy rates of 54 percent, 66 percent, 56 percent and 68 percent, respectively.

However Windows 7 was acquired -- legally or not -- its increase was outweighed by a steeper-than-usual decline in Windows XP last month. The eight-year-old operating system lost 0.92 percentage point in October, significantly more than the 0.64 point average loss each month during the past year, falling to 70.6 percent.

Vista rebounded from September, when it fell for the first time in more than two years. Vista's October share of 18.77 percent, however, was still off its record of 18.8 percent in August.

Windows' overall share dropped 0.23 of a percentage point to 92.5 percent. Microsoft's OS has lost about two and a half share points in the last year.

As it has repeatedly, Mac OS X was the recipient of most of the users lost to Microsoft: Apple's operating system climbed by 0.15 of a percentage point to end October at 5.27 percent, a new record and only the second time it's finished above 5 percent since Net Applications revised its methodology in June.

Net Applications measures operating system usage by tracking the machines that surf to the 40,000 sites it monitors for clients, which results in a pool of about 160 million unique visitors per month. It weights share by the estimated size of each country's Internet population.
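That weighting step can be sketched in a few lines. The country names and counts below are invented for illustration; only the idea of weighting each country's observed usage by its share of the world's Internet population comes from the article's description of the methodology.

```python
# Hypothetical per-country samples: (visitors observed running the OS,
# total visitors observed from that country).
samples = {
    "A": (50, 1000),    # 5% raw usage
    "B": (300, 2000),   # 15% raw usage
}

# Hypothetical estimated Internet populations for the same countries.
internet_population = {"A": 40_000_000, "B": 10_000_000}

total_pop = sum(internet_population.values())

# Weight each country's raw usage rate by its share of the total
# Internet population, then sum to get the global figure.
weighted_share = sum(
    (hits / total) * (internet_population[c] / total_pop)
    for c, (hits, total) in samples.items()
)
print(f"{weighted_share:.1%}")  # -> 7.0%
```

Without weighting, the heavily sampled country B would dominate the average; weighting pulls the global number toward the usage rate of the country with the larger online population.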

Source:

http://www.infoworld.com/d/windows/windows-7-share-surges-40-in-first-week-release-471

Chicago Website Design

Sunday, November 1, 2009

Is Google Trying to Kill us by Screen-Only Navigation?

With all the hype around Google’s big announcement yesterday that it would be providing Android devices with free turn-by-turn directions, you’d think standalone GPS companies should call it a day and pack up the office. But we’re missing something: nowhere does Google say anything about spoken turn-by-turn directions.

Turn left in 1.3 miles

One of the best features of a GPS navigation system, in my book at least, is audible instructions, so I don’t have to take my eyes off the road. TeleNav, which I use, even speaks the names of the roads I should be turning onto. How can a GPS navigation system be viable without spoken instructions? Isn’t this traffic safety week or something?

Rerouting?

I’ve been thinking since the announcement that I must have overlooked it. I twittered my question to bloggers and got no responses. I combed Google’s Navigation site and watched videos. The closest I came to any kind of confirmation was a mute button. Not very convincing, right?

There is nothing better than connected GPS (IMO), and Google brings it. I really dig Street View, another great thing Google brings. But without spoken navigation instructions, what’s the point? It would be just a step above the Google Maps route info on the iPhone. At best, we get pulled over under all these no-texting laws; at worst, we take our eyes off the road and miss who knows what. Surely Google isn’t trying to kill us, right?

I’ll continue searching. If you find something on this, post it in the comments. We’ll get to the bottom of this, hopefully before the phone launches.

Source:

http://www.gadgetell.com/tech/comment/is-google-trying-to-kill-us-by-screen-only-navigation/

Chicago Website Design