Science News of 2007
The Technology eZine - Year in Review



Science Buzz 2007

We still haven't finished exploring our oceans. That simple idea, started a decade ago, has now grown into the Census of Marine Life, the most ambitious scientific collaboration the world has ever seen. More than 2,000 researchers from 80 countries are engaged in 17 pioneering projects to catalogue the largely unknown biological diversity of the oceanic 70 per cent of the planet. And this year – the seventh of a planned 10 – the census unequivocally reached the status of a seminal idea in the history of science, as it became clear it would produce a representative picture of marine life by the 2010 finish line.

Roughly 300,000 "macro" organisms – things bigger than microbes – are already known from the oceans, and the census expects to catalogue a million species in total by 2010. Most will not be new to science but will be rigorously identified for the first time, often by a bar-coding technique pioneered at the University of Guelph. About 5,300 previously unknown species of plants, animals and fungi have been identified already. Yet census senior scientist Ron O'Dor estimates that the project will have covered no more than a quarter of the world's oceans by volume when it ends. Planning is already underway for another such effort, to culminate in 2020.
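
To make the bar-coding idea concrete, here is a minimal sketch in Python of how a barcode read might be matched against a reference library. The sequences, species names and the simple identity score are all invented for illustration; real pipelines compare a standard marker gene against large curated databases.

    # Minimal sketch of DNA barcode matching: compare a query sequence against a
    # tiny, made-up reference library and report the closest match. The sequences
    # and species names are purely illustrative, not real barcode data.

    def identity(a, b):
        """Fraction of positions that agree, over the shorter sequence."""
        n = min(len(a), len(b))
        return sum(x == y for x, y in zip(a[:n], b[:n])) / n

    reference_library = {                  # hypothetical reference barcodes
        "Species A": "ACGTACGTTGCAACGT",
        "Species B": "ACGTTCGTAGCAACCT",
        "Species C": "TTGTACGAAGCAACGT",
    }

    query = "ACGTACGTTGCAACCT"             # read from an unidentified specimen

    best = max(reference_library, key=lambda sp: identity(query, reference_library[sp]))
    print(best, round(identity(query, reference_library[best]), 2))   # Species A 0.94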

1. Personal Genome

For the first time, scientists have sequenced the genome of a single individual – a step they call a huge leap forward in the quest for personalized medicine based on each of our genetic codes.

Craig Venter, whose former company Celera Genomics did much of the groundbreaking work in the original mapping of the human genome from a composite of several individuals' DNA, became the first person to have his full genetic code posted on the Internet for anyone to see. Venter's genome, which includes genes inherited from both his parents, was published online in PLoS Biology, providing researchers with a rich reference source for comparing DNA sequences from other humans. This was the year the worm turned in human genetics. Until now, the mantra had been that there was astonishingly little difference between individuals in the genetic blueprint, or genome – and not as much genetic difference as expected between humans and some of the great apes.

It turns out the mantra was wrong. A rash of studies published in 2007 demonstrated just how much DNA can differ between individuals.

The studies rely on charting minute variations in genomes called single-nucleotide polymorphisms (SNPs, pronounced "snips"). By mid-2007, more than 3 million such locations had been identified.
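
As a toy illustration of what such a variation looks like, the sketch below scans two otherwise identical DNA sequences and reports the single positions where they differ. The two sequences are invented for the example.

    # Toy illustration of single-nucleotide polymorphisms (SNPs): the positions
    # where two otherwise matching DNA sequences differ by a single base.
    # Both sequences are invented for the example.

    seq_person_1 = "ATGCGTACCTGA"
    seq_person_2 = "ATGCGTACTTGA"   # differs from the first at one position

    snps = [(i, a, b)
            for i, (a, b) in enumerate(zip(seq_person_1, seq_person_2))
            if a != b]

    print(snps)   # [(8, 'C', 'T')] -- a C/T polymorphism at position 8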

But did these differences matter? In a dozen studies this past year, researchers compared the DNA of people with complex afflictions, such as diabetes, hypertension and rheumatoid arthritis, to the DNA of people free of those afflictions. These "genome-wide association studies" linked increased risk for these afflictions with variants of more than 50 genes.
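
To see the logic of a genome-wide association study in miniature, the sketch below tallies how often a hypothetical variant turns up in affected versus unaffected people and computes an odds ratio. All of the counts are invented; they are not taken from any of the studies described here.

    # Miniature version of the case/control comparison behind a genome-wide
    # association study: count carriers of a variant among people with and
    # without a condition, then compute an odds ratio. All counts are invented.

    cases_with_variant, cases_without = 300, 700
    controls_with_variant, controls_without = 200, 800

    odds_cases = cases_with_variant / cases_without            # 0.43
    odds_controls = controls_with_variant / controls_without   # 0.25

    odds_ratio = odds_cases / odds_controls
    print(round(odds_ratio, 2))   # ~1.71: carriers show higher odds of the condition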

The effects from any one variant may well be small and could also be outweighed by environmental factors and other genetic conditions. But establishing this preliminary link was enough for the journal Science to select human genetic variation as the breakthrough of the year. The journal's account can be read at sciencemag.org/sciext/btoy2007.

2. Cancer marker found

An Ontario-led team of scientists helped find the first genetic predictor for colorectal cancer, the second deadliest form of cancer in Canada. Experts said the finding, announced in July, will lead to huge advances in screening and prevention methods. After digging through more than 100,000 pieces of genetic material from 15,000 people, the team found a specific site on chromosome 8 associated with colorectal cancer. Research teams from Britain and the United States also found the same site on the same chromosome.

"Having the site increases a person's risk of getting colorectal cancer by about 20 per cent," said Tom Hudson, president of the Ontario Institute of Cancer Research and co-leader of the study. The finding, he said, will make it easier for people who have a family history of colorectal cancer to predict the likelihood of their getting the disease. Catching the disease early is imperative to try to halt the cancer.

3. Elderly Alzheimer's Gene

University of Toronto scientists led an international team that announced the discovery this year of a gene responsible for many cases of the most common form of Alzheimer's disease. A defective form of the gene, known as SORL1, could doom about 60 per cent of the people who possess it to late-onset Alzheimer's and appears to cross many ethnic and racial lines. While it's too early to say how common the mutant gene may be in the general population, U of T researchers say it might create a two-fold increase in the risk that people will develop the neurological ailment late in life. The discovery was published in the February issue of the journal Nature Genetics. About 90 per cent of all Alzheimer's cases are of the late-onset variety, which typically affects people 65 and older.

4. 'Cosmic bullets' mystery solved

In November, an international collaboration of almost 400 scientists from 17 countries announced that the 1,600 specialized detectors at the Pierre Auger Observatory in Argentina had succeeded in solving the mystery of "cosmic bullets" – rare atomic particles that slam into the atmosphere with ultra high energy. The "cosmic bullets" almost certainly originate from turbulent black holes that lie at the centre of hyperactive galaxies. These particles are the high-energy end of what are commonly called cosmic rays. Positively linking the cosmic bullets to specific sources means that astronomers can now use them as new "eyes" to study aspects of the universe that can't be viewed using visible light, infrared, ultraviolet, X-rays or other existing tools.

5. Insights into dawn of life

Every year, the Royal Ontario Museum's experts mount an all-day public colloquium to showcase their most interesting research. The session often features new discoveries from the ROM's 150,000 fossil specimens gathered at the Burgess Shale, a B.C. site where special conditions captured the dawn of life on Earth a half billion years ago.

Jean-Bernard Caron, the assistant curator of invertebrate paleontology, had one such discovery this year. But he could only drop hints at the colloquium because of the timing of a scientific paper. Just weeks later, Caron and collaborator Simon Conway Morris announced that they'd found a new early animal species preserved in nine Burgess Shale fossils.

Covered with a prominent shell plus armoured plates, these halwaxiids unite two previously mysterious groups of primitive slug-like animals. The fossils also suggest that molluscs may have emerged earlier than previously thought, information that could rewrite the timeline of evolutionary biology.

6. Light shed on dark energy

Astronomers use the brightness of a special kind of supernova as a "standard candle" for estimating the speed at which the universe is spreading outward. These calculations are used to tease out the effects of dark energy, a mysterious force thought to account for three-quarters of the content of the universe.

Dark energy is also fingered as the reason the universe is expanding much faster than it should be, based on observable energy sources. Now it turns out that those candles – the explosions of white dwarf stars – aren't as reliable gauges of distance as scientists thought, according to an international study led by University of Toronto researcher Andrew Howell.

By examining data from two large-scale surveys, the researchers concluded that supernovae were on average 12 per cent brighter 8 billion years ago than they are now. This variation could seriously hamper attempts to understand dark energy, one of the universe's biggest puzzles.
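
A back-of-the-envelope calculation, sketched below, shows roughly how a brightness shift of that size would propagate into inferred distances via the standard distance-modulus relation. It assumes the 12 per cent figure refers to luminosity; the numbers are illustrative only.

    # Rough back-of-the-envelope: how a 12 per cent shift in supernova brightness
    # propagates into inferred distances through the distance modulus
    # m - M = 5 * log10(d / 10 pc). Assumes the 12 per cent refers to luminosity.

    import math

    flux_ratio = 1.12                              # 12 per cent brighter
    delta_mag = -2.5 * math.log10(flux_ratio)      # about -0.12 magnitudes

    # A systematic magnitude offset of this size corresponds to a fractional
    # distance error of 10**(|delta_mag| / 5) - 1.
    distance_error = 10 ** (abs(delta_mag) / 5) - 1
    print(round(delta_mag, 3), round(distance_error * 100, 1))   # -0.123 and ~5.8 per cent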

7. Mediocrity rules

People are so convinced they can't do well in certain pursuits that they can inadvertently sabotage their own performance to achieve their low expectations.

University of Toronto social psychologist Jason Plaks says this self-imposed mediocrity in selected areas is so pervasive that people become anxious if they achieve above their expectations. The anxiety of overachieving in areas they've convinced themselves they're no good at may actually limit some people's ability to do well in them, Plaks says.

8. Genetic link found for Autism

After years of searching, an international team of scientists homed in on the genetic underpinnings of autism last February. The study, co-led by Toronto's Stephen Scherer, a senior scientist at the Hospital for Sick Children, was heralded as a major breakthrough in autism research and could lead to better ways of diagnosing autism spectrum disorder, estimated to affect one in 165 children. The five-year collaboration, part of the Autism Genome Project, involved more than 130 scientists in 50 institutions in 19 countries, and cost $20 million.

The scientists searched for autism-susceptibility genes by sifting through the DNA of 1,600 families with at least two members with autism. The genome-wide scans led scientists to a previously unidentified area on chromosome 11, which they now believe harbours genes that increase the risk for autism. Previous studies have suggested between eight and 20 different genes are linked to autism, but Scherer now estimates 100 genes could be involved.

9. Goodbye to plonk?

For centuries, the vineyards of France have been the birthplace of some of the most distinct and sought-after grapes in the world. Wine producers from Australia to California and Niagara to South Africa have all tried to emulate the tastes of native French grapes.

But their attempts at emulation might soon evolve into genetic replication. This year a team of French and Italian researchers mapped the genome of one of the most respected French grapes, the pinot noir, used to make red wine and bubbly in France's Burgundy region.

In cracking the genetic code of the pinot grape, these scientists have isolated 30,000 genes in its DNA and now know precisely what makes a French pinot grape taste the way it does.

"Pinot-based wines produced in say Burgundy, while similar, are still distinctly different from those produced in California, Oregon or New Zealand," Allen Meadows, a leading Burgundy critic told reporters earlier in the year.

There are several ways a grape's genome might be put to use: cross-breeding to produce a new grape, or strengthening the genetic makeup of a mediocre vine's fruit, to name but two. But perhaps the most intriguing outcome could be the synthetic replication of the finest pinot noir grape in the world.

If the genetic makeup of a grape is the recipe for a fine wine, then it's possible that future bottles of red might carry the same flavouring every time, much as Coca-Cola is formulated to taste identical in every batch. And if growers of lesser-respected grapes replicate the taste of the true pinot noir, an otherwise cheap bottle of sludge from a newly planted Okanagan grapevine might soon be made to taste precisely the same as an authentic bottle of 1994 French bubbly.

10. Play golf, stay young

A study of veteran professional golfers has found that the game may help keep you young – as long as you keep teeing it up.

Analyzing results in various categories of the game for 96 professionals who had played on the PGA tour for at least 12 years, researchers from York University and Queen's University found that while more strength-related aspects of the game, such as driving distance, declined over time, the mind-focused ones, including putting and driving accuracy, actually got better.

The study shows "that cognitive, perceptual and motor skills are really resistant to decline, but you've got to stay involved," said lead author Joe Baker, a professor of kinesiology and health science at York.

Golf offers a great deal both physically and mentally, he said, especially if players walk the course rather than take a beer-laden cart.

Baker noted that even though the study focused on elite athletes it suggests duffers can maintain cognitive skills at good levels if they practice often enough.


Computer Tech Buzz 2007

1. Java is becoming the new Cobol

Java, the oldest new programming language around, is falling out of favor with developers. When it comes to developing the increasingly common rich Internet applications, Java is losing ground to Ruby on Rails, PHP, AJAX and other cool new languages. And there are even reports that Microsoft’s .Net, of all things, is pushing Java out of the enterprise. Makes you wonder whether Sun was smart to change its stock-ticker code to JAVA last summer.

Simply put, developers are saying that Java slows them down. “There were big promises that Java would solve incompatibility problems [across platforms]. But now there are different versions and different downloads, creating complications,” says Peter Thoeny, CEO of TWiki.net, which produces a certified version of the open source TWiki wiki-platform software. “It has not gotten easier. It’s more complicated,” concurs Ofer Ronen, CEO of Sendori, which routes domain traffic to online advertisers and ad networks. Sendori has moved to Ruby on Rails. Ronen says Ruby offers pre-built structures — say, a shopping cart for an e-commerce site — that you’d have to code from the ground up using Java.

Another area of weakness is the development of mobile applications. Java’s UI capabilities and its memory footprint simply don’t measure up, says Samir Shah, CEO of software testing provider Zephyr. No wonder the mobile edition of Java has all but disappeared, and no wonder Google is creating its own version (Android).

These weaknesses are having a real effect. Late last month, Info-Tech Research Group said its survey of 1,850 businesses found .Net the choice over Java among businesses of all sizes and industries, thanks to its promotion via Visual Studio and SharePoint. “Microsoft is driving uptake of the .Net platform at the expense of Java,” says George Goodall, a senior research analyst at Info-Tech.

One bit of good news: developers and analysts agree that Java is alive and well for internally developed enterprise apps. “On the back end, there is still a substantial amount of infrastructure available that makes Java a very strong contender,” says Zephyr’s Shah.

Now that Java is no longer the unchallenged champ for Internet-delivered apps, it makes sense for companies to find programmers who are skilled in the new languages. If you’re a Java developer, now’s the time to invest in new skills.

2. Sun Microsystems is back in the game

Sun Microsystems has been in and out of the (metaphorical) grave more often than Count Dracula. The one-time king of the Internet servers suffered a body blow when the dot-com bubble burst, and since then it’s been a struggle to keep the company’s name on the buy lists of enterprise IT shoppers. That struggle has been chronicled by endless stories in the trade and financial press with headlines featuring bad puns like “The Sun is setting.”

Those days are finally, if quietly, coming to a close.

In less than two years in the CEO’s chair, Jonathan Schwartz has put the company firmly in the black, resuscitated Sun’s flagging software and storage businesses after years of bleeding, and put together a product road map that Morgan Stanley analyst Kathryn Huberty calls its strongest in years.

“You have to credit Schwartz because he understands the way to combine hardware and software,” says Bud Mathaisel, CIO of outsourced-IT provider Achievo. “Although Sun’s image slipped from public view, they were making very competitive technology and pricing.”

Sun’s Galaxy and blade server lines are hits across a wide swath of businesses, and its storage business is growing. Sun has already entered the market for quad-core servers (while rival AMD’s Barcelona remains problem-plagued), and in the next year or so Sun will roll out new server, storage, and networking products. Also, Sun acquired SeeBeyond in 2005, a $387 million buy that positions Sun to become a vendor of choice in the race to provide the integration tools needed to support enterprises’ SOA strategies.

The powerful new hardware lineup, plus the enterprise service bus that came with SeeBeyond, once again makes Sun core to the datacenter.

Not all of the credit goes to Schwartz, of course. His predecessor, cofounder Scott McNealy, poured hundreds of millions into R&D in the dark days of 2002 and created the UltraSparc T1 (or Niagara), the server chip that’s keeping Sun competitive with larger rivals IBM, HP, and Dell.

What’s more, McNealy had the foresight to bring back hardware wizard Andy Bechtolsheim in 2004. Bechtolsheim led Sun efforts to design the AMD Opteron-based Sun Fire x64 servers — better known by their code name “Galaxy” — that were key to Sun’s datacenter reentry.

Sun’s transition from leader to laggard and back to contender has been very painful, particularly because of the deep cuts in personnel Sun has been forced to make. But the numbers — four profitable quarters in a row plus record margins — as well as a growing list of wins, point to a company on the upswing. Sun may not be at the top of everybody’s vendor list, but once again, it’s worth your consideration.

3. Hackers take aim at Mac OS X

It’s not often that an analyst covering computer security issues tells you that he doesn’t do much to protect his systems. But one reputable analyst I know said just that as we talked about the rising threat of malware aimed at Apple’s hardware. I won’t mention his name, but the gentleman is dead wrong. The days when you can assume that Apple’s products are exempt from harm are over.

Is it time to panic? No, actual attacks against Macs and the rest of the Apple family, such as the iPhone, are still rare. But as the platform becomes more and more popular, hackers are gearing up to do damage. You’d better protect yourself.

“Most Mac users take security too lightly. In fact, most are quite proud of the fact that they don’t run any security at all,” says IDC analyst Chris Christiansen. “That’s an open door; at some point it will be exploited,” he says.

First some numbers: In 2006, the National Institute of Standards and Technology (NIST) tabulated 106 “vulnerabilities” in Apple’s Mac OS X. (It defines a vulnerability as a weakness in code that could be exploited to make an application perform unauthorized, and generally harmful, functions.) In the first six months of 2007 there were 78 vulnerabilities found in Mac OS X. Windows XP (all flavors), meanwhile, had 55 vulnerabilities in 2006 and 19 in the first six months of 2007. Vista, which wasn’t available in 2006, chalked up 19 vulnerabilities in 2007.

In a sense, Apple is a victim of its own success. Savvy hackers read the same stories and watch the same television programs as the rest of us, and so they are very aware of the burgeoning popularity of Apple’s products. Hacking Windows still provides a lot more bang per bug than attacks on Apple, but the smaller rival is a more satisfying target than ever before. And the company’s deserved reputation for building good products has probably made users overconfident.

“Apple has better commercials, but the Mac is no harder to break into than a Windows PC,” says Gartner security analyst John Pescatore. What’s more, most IT shops can automatically patch large numbers of PCs at the same time, while Macs generally have to be patched one at a time, he said.

Actual attacks on the Mac platform are still unusual. But as it becomes a juicier target, that will change. Why take a chance? Give a lot more thought to securing your Macs this year.

4. There are some threats you can worry less about

Even when there’s good news on the security front, there always seems to be bad news to balance it out. And while this news is important for you to know, it’s no exception to the good news/bad news syndrome.

The use of e-mail-borne executable virus attachments dropped sharply in 2007. In particular, spammers have dialed back their use of so-called image spam, which tricks filters by embedding text within an attached image.

According to IronPort Systems, outbreaks of viruses embedded in e-mail attachments totaled 860 in 2006 and 844 in 2005. But as of mid-October 2007, attacks totaled just 360. Dave Mayer, an IronPort product manager, said it’s likely that outbreaks will total 450 by the end of the year — a drop of 47 percent. It appears that evildoers are moving on: “Traditional viruses have been around for years, a long enough time to harden defenses against malicious attachments,” he said.

Earlier in the year, IronPort, Symantec, and McAfee all noted that image spam, which appeared about two years ago, is waning, now that defenses have improved. Filters have now gotten better at scanning the contents of the attachments, leading spammers to link instead to images elsewhere.

Mayer, whose company specializes in e-mail and Web security, says spammers are now placing those images on free photo-sharing sites — the ones people use to send vacation photos to friends and family — and embedding links to those images in their junk messages. These are difficult for spam filters to block because the same sites are used for legitimate photos as well.

But as the abuse of e-mail attachments has declined, other types of threats have escalated. For example, outbreaks of macro viruses, aimed at Office-type applications, have risen by 50 percent to 60 percent while URL viruses climbed by at least 250 percent this year, IronPort found.

It often seems as if the number of threats that you must be on guard against only increases, stretching your resources past the limits. But the truth is that threat profiles change over time, and some things you may be investing in are no longer as great a threat or are now handled by the tools you have in place. So, although you can’t get lax about security, you can focus more on newer threats.

5. Companies may have found a way around H-1B visa limits

Silicon Valley businesses have long argued that changes in the immigration laws are needed to ensure a continuing supply of highly skilled workers. The current cap of 65,000 standard H-1B visas is not enough, they say. (The quota was filled in less than a month in 2007.)

But that number obscures an important fact: The real total of visas issued to highly skilled workers is closer to 400,000 annually, according to the federal Citizenship and Immigration Services (CIS) agency. And that, say some critics, may mean that the law is being abused.

There are two reasons the number is so large.

First, the H-1B visa cap has a built-in exemption that allows an additional 20,000 workers who have graduated from U.S. universities with an advanced degree (master’s or higher) to enter every year.

Second — and the biggest reason — is the use of L-1 visas, which are granted to executives and workers with specialized skills employed by multinational companies. Because there is no cap on L-1 visas issued each year, the numbers have soared. In the last three years, an average of 315,000 L-1 visas have been issued each year.
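
The arithmetic behind the "closer to 400,000" figure is simple enough to reconstruct from the numbers cited above, as the sketch below shows; it is a rough reconciliation, not an official breakdown.

    # Rough reconciliation of the ~400,000 figure from the numbers cited above.
    # This is an approximation, not an official government breakdown.

    h1b_cap = 65_000                    # standard H-1B annual cap
    advanced_degree_exemption = 20_000  # extra visas for U.S. advanced-degree graduates
    l1_average = 315_000                # average L-1 visas per year, last three years

    print(h1b_cap + advanced_degree_exemption + l1_average)   # 400000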

Unlike the H-1B visas, the L-1 visas are not intended to be a springboard to possible permanent residency and a coveted green card. In essence, the L-1s are intended to allow multinational companies to rotate staff across national borders, so they can transfer foreign managers and specialists within the company to U.S. offices for a limited period of time.

But the vast number of workers admitted with L-1 visas has critics suspecting that companies — Indian outsourcing firms in particular — are using them as a back door to bring in lower-paid workers to do jobs that could be performed by Americans, rather than for the intended purpose of staff rotation.

"It's clear that foreign outsourcing firms are abusing the system, and we can't let that continue," Sen. Richard Durbin (D-Ill.) said earlier this year as Congress debated immigration reform. Here’s what upset him: According to immigration records, 14 of 20 companies whose employees were granted the most L-1 visas were offshore outsourcing firms, including Tata Consultancy, Satyam Computer Services, Wipro, and Infosys Technologies.

Tata Consultancy obtained 4,887 L-1s in fiscal 2006 — the most of any company — and 3,601 in fiscal 2007, second only to Cognizant (4,869), a U.S.-based outsourcer with a major presence in India. "I find it hard to believe that any one company has that many individuals … legitimately being transferred within a single year," Durbin said in Congress. (Tata declined to comment to InfoWorld, but a spokesman told BusinessWeek, "We're complying with the law and with the regulations. We want to work with Congress to address any issues they may have.")

Infosys, Wipro, Satyam, Tata, and Cognizant also appear on the CIS’s top 10 list of companies obtaining H-1B visas for employees in fiscal 2007, along with Microsoft, Cisco, IBM, Motorola, and Intel. Those facts further raise critics’ suspicions.

Bob Meltzer, who heads the VisaNow firm that helps applicants obtain visas, says there’s a growing belief that the L-1s are being abused today, though he can’t tell if that belief is justified. He does note that the L-1s had been abused in the early 1990s in the manner that critics suspect is happening today.

Immigration is a tough issue. And it’s even tougher when we don’t have all the facts, as is the case with employers’ claims of labor shortages requiring more foreign hires and employees’ claims of being replaced by foreign workers by companies looking to save a buck. When you’ve figured out the right balance, let the pols know what you think.

6. Open source’s new commercial strategy

What’s in a name? Back in the days of the Gingrich revolution, the nastiest label you could pin on a politician was “liberal.” And now that open source has become an essential technology, the quickest way to get a rise out of an open source executive (and to get flamed on dozens of blogs) is to say his or her company is “commercial.”

No matter what you call them, open source companies have been steadily integrating parts of the hated commercial software subscription model into their business. First it was support, and now it is software access. Say it ain’t so, Linus.

MySQL, for example, will charge users of its enterprise edition to use the code, not just for support as has been the traditional open source business model. “We are looking for ways to extend our offerings in a way that can be monetized,” bluntly says Kaj Arno, the company’s VP for community.

Chander Kant, founder of the open source storage software company Zmanda, is likewise frank about the need to drive revenue. Speaking of the open source community as a whole, he says, “We are not a charity. We need to monetize.”

Recognizing that charging to use open source software flies in the face of the original open source premise of freely sharable code open to everyone, Arno is quick to note that the paid version of MySQL’s Workbench tool has code and features not found in the free version — and that the free version “is not crippled.” Arno calls the move an experiment, and acknowledges that MySQL needs to move carefully to avoid upsetting the often quick-tempered open source community.

Is this a trend? RedMonk analyst Stephen O’Grady says it is: “Ultimately when you look at a high-profile firm like MySQL embracing it, it’s difficult to conclude anything else.”

InfoWorld blogger (and IBM employee) Savio Rodrigues notes another example. In a recent blog post, he looked at changes to JBoss since it was purchased by Red Hat Software and concluded: “The Fedora/JBoss subscription is just like the software subscription business model that commercial software vendors have been using for decades.”

Does that mean a company that adopts the usage-based pricing model isn’t really open-source? That’s an interesting point, since as Zmanda’s Kant acknowledges, “the litmus test of open source is control of the code.” Open source software delivered through appliances keeps the access — and thus control — away from users, he notes. But code binaries distributed as software remain open to their users — for now, at least.

Have MySQL and Red Hat run away from the principles of open source? Who cares. The real point is that the open source business model is evolving away from the “free” model and toward an alternative development approach for commercial software. If you adopted open source due to its cost, you may not save as much as you thought. If you adopted open source for the access to source code or the community-supported development approach that is supposed to result in better software, it’s not yet clear if the greater commercialization will enhance those aspects or drive away the true believers and make open source no different than the old-fashioned commercial offerings.

7. End-to-end Ethernet finally arrives

Some technologies always seem to be “just around the corner.” Seven years ago, there was a big flurry of interest in end-to-end Ethernet (aka metro Ethernet) as a brace of new ventures promised to cut through the complexities of wide area networking.

They didn’t deliver. One big reason: building a new network that will reach into offices and homes costs beaucoup bucks. As IDC analyst Boyd Chastant puts it, “Digging a hole in the ground never gets cheaper.”

Laying fiber is still expensive, but now telcos — including AT&T, Verizon, and Qwest — plus cable companies such as Optimum LightPath, Cox, and Time Warner that already have fiber in the ground are offering Ethernet services. LightPath, for example, boasts of 2,500 miles of fiber in the ground and says it has lit more than 2,000 buildings in its service area in the northeast.

End-to-end Ethernet offers a very affordable way to connect LANs to a wide area network. After all, every router has an Ethernet interface, you don’t need much special hardware, and the technology is very familiar. So most IT staffers will have little trouble adapting, says Burton Group analyst Jeff Young. That is, if you can get it: Pricing depends on your need for speed, and not all of the providers offer it throughout their service areas.

Raw bandwidth isn’t Ethernet’s big advantage. It’s not necessarily much faster than private lines. However, says Chastant, it is more scalable. Conventional private-line services offer speeds near the bottom (around 10Mbps) and near the top (around 10Gbps), but not much in the middle. End-to-end Ethernet solves that. And it can be tied to Internet access, bumping up access speeds appreciably.

Availability is on the upswing too, says Chastant. In the past, carriers worried that end-to-end Ethernet service would cannibalize other data services. But now that demand has reached critical mass, that concern appears to have faded. Meanwhile, providers and equipment manufacturers are ironing out incompatibilities and clarifying service definitions in the Metro Ethernet Forum, Chastant says.

It looks like we’ve turned the corner and end-to-end Ethernet is right in front of us.

If your carrier doesn’t provide end-to-end Ethernet, another provider likely does. It’s worth the trouble to look: End-to-end Ethernet will give you the performance you need and because it’s a familiar technology you won’t have to retrain or replace your IT staff.

8. Blade servers arrive for the masses

Call it a trend within a trend. Blade servers have become the fastest growing segment of the server market, and for the first time accounted for more than $1 billion in sales during the third quarter of 2007. Although the vast majority of those sales were to enterprise customers, IBM and Hewlett-Packard have redoubled their efforts to push blades downmarket to small and medium businesses.

And why not? There are tens of thousands of relatively small businesses, not to mention remote offices, that already run three or more servers — in many cases with only minimal IT resources.

Both HP and IBM launched blade products aimed at the small-to-medium business market late this year. HP has the BladeSystem c3000, which can hold as many as eight blades in a single chassis, and IBM has the BladeCenter S, a six-blade chassis system with on-board storage and switch options. These may sound familiar; both companies have touted blades as a solution for the small-to-medium business market for a few years. But their newest blade servers have actually been redesigned with the needs of smaller companies in mind.

A good way to understand blades’ appeal to smaller businesses is to think about what an enterprise datacenter has that a small shop doesn’t — and how the new generation of blade servers can help fill those gaps.

First, datacenters offer a controlled environment. Heat and dirt are server killers, but a small business may well stick a server under a desk or in a closet. So the new breed of blades comes with a chassis that filters and cools the air, and reduces noise.

Second, datacenters rely on dedicated power systems, which smaller businesses just can’t afford. The “downmarket” blades from HP and IBM, however, run on standard 110-volt lines and have the same three-pronged plugs you’ll find on a kitchen appliance.

Third, datacenters have lots of room. A small business has no space for racks, but a chassis filled with six or eight blades takes up no more space than a desktop PC.

Fourth, datacenters have dedicated IT support. Small businesses can’t afford that, either. Configuring small-business blades is relatively simple, though, and most customers get them preconfigured from a reseller that specializes in their industry. Either way, the blades should be up and running fairly quickly.

For now, HP and IBM are the only major vendors offering blades tailored for smaller customers. Between them, however, they account for about 75 percent of the blade market, so that may not matter. Dell is rumored to be contemplating a move into this market, and other vendors could follow.

Why should the big boys have all the fun? Blades designed for smaller businesses let you get some of the datacenter capabilities that just weren’t possible before, without the headaches and costs of a traditional datacenter. Why not check them out?

9. BI is dead; long live BI

With three of the largest competitors in the field taken over by software giants, you’d think that the best days of the business intelligence market were over. You’d be wrong.

Far from heralding the death of BI, the loss of Business Objects, Cognos and Hyperion to SAP, IBM and Oracle may well signal “a golden age for business intelligence,” says AMR Research analyst John Hagerty.

Sure, that’s a brave statement. But the BI market has never been as narrow as many people assumed. There are significant players left, including SAS, SPSS, MicroStrategy, Actuate, and dozens of smaller, often specialized, vendors. “Go beyond the core tools of reporting and analysis and you’ll see lots of data-mining and statistical tool vendors, as well as industry-specific developers,” says IDC analyst Dan Vesset.

The acquisitions of the big guys now create space for those other companies to flourish, says Rob Tholemeier, a former industry analyst turned private investor. The same thing happened to the database market, he notes: “There are more database companies around now than when Informix was purchased.” BI, Tholemeier says, is likely to follow the same course. “Innovation will come from the outside [companies] as the big guys expand their scale and scope,” concurs AMR’s Hagerty. Pricing, of course, is a major concern as markets consolidate. But because the supply of BI software is still ample, given the large number of independents, buyers still have significant leverage, says Vesset — though that could change further out, he adds. Moreover, it will take the platform vendors some time to integrate their acquisitions, both in terms of code and business process. And while that process plays out, prices aren’t likely to spike.

As investor Tholemeier puts it, “Clearing out the old-timers leaves a huge vacuum for the dozens of innovative players ready to strut their stuff. The business intelligence industry is dead. Long live BI.” That change presents both a challenge for IT to understand the new BI, and the opportunity to drive BI’s benefits to a greater portion of the company.

10. Balance of power shifts to software buyers

Despite the wave of consolidation sweeping the industry, buyers have actually gained more leverage than ever when it comes to making deals with their vendors, notes a PricewaterhouseCoopers analysis. “Software buyers need to realize that the pendulum is beginning to swing in their favor and that there are an increasing number of alternatives in today’s software market,” concurs Gartner analyst William Snyder (no relation to this writer).

There are a lot of reasons for the swing, but one certainly stands out: the shift to SaaS (software as a service). The poster child for this phenomenon is Salesforce.com. “Both its development and delivery models are having a disruptive effect on the entire CRM market,” notes Forrester Research analyst Bill Band. But it’s not just Salesforce: By 2011, Snyder says, fully 25 percent of all new business software will be delivered as a service as Salesforce and other vendors push upstream from the small-business market. Already, SaaS providers account for the majority of sales in several segments of human resources applications, and SaaS is well established in supply-chain management.

SaaS is just the first wave of services that are pushing pricing power over to buyers: The more recent move to Web services and the beginnings of SOA (service-oriented architecture) deployment are encouraging the use of modular software, which in turn cuts development costs, makes it harder for vendors to sell expensive suites, and lessens the need for expensive consulting services.

And yet another factor is upping buyers’ power, notes Gartner’s Snyder: the emergence of third-party support options that are reducing software’s TCO (total cost of ownership). “Maintenance services have been a quasi-monopoly for software vendors,” he said. “For many years, only the software vendor had rights to the source code of the critical components of the system. This made it impossible to go to an open, competitive market to obtain upgrades, services and support, which in turn made it impossible to negotiate maintenance fees.”

Interestingly another software trend — the growth of virtualization — is beginning to affect pricing for hardware, particularly servers. A report by Infiniti Research suggests that server shipments will start declining in 2008: “The server market of tomorrow will be a value game and not a volume game.” And that means better pricing for hardware, too.

Don’t let your head swell too much, but Gartner vice president Mark McDonald credits “savvy CIOs” for containing costs that were once out of control. “They’re saying, ‘I want you to implement in a way that gives me the advantage over my competition’ — and they’re not accepting the traditional vendor model, which favors homogeneity and the economics of scale.”

Chances are that you’re looking at SaaS, Web services, SOA, virtualization, open source, and other technology-delivery methods for a variety of specific reasons, without realizing the cumulative effect is to give you more pricing power. Now that you’ve got the power, use it.
