December 1, 2004

Technologies, products, implementation issues… which way will they swing?

Predicting that is dicey enough; many a stalwart has bitten the dust trying. So, when you start reading this piece, do keep that in mind.

In this issue we look at one hundred areas and try to foretell which way they will go in the years to come. Instead of looking at specific products, we have covered technology, product and implementation areas; in fact only one product, Mozilla Firefox, makes it to our list. Again, instead of covering only those that will make a difference, we have also given pointers to technologies and areas that are fading out.

Out of necessity, we have kept the coverage of each area brief. Almost all of them have been covered in detail in the recent past, and where there is abiding interest, we have done detailed stories elsewhere in this issue.

Do write in to tell us whether you agree with our predictions, and together, we will watch how they actually play out.

What’s hot, what’s not

We recommend: Focus on the hot areas, assuming you have already done the steady ones

Buzz
Lot of talk, no concrete action; may or may not happen

  • Application Lifecycle
  • Biological Computing
  • eXtensible Business Reporting Language
  • Micro Fuel Cells
  • Micro Payments
  • Nanotechnology
  • Personal Area Networks
  • Robotics
  • Service-Oriented Architecture
  • Unified Communications
  • Utility Computing
  • Wireless PANs
  • Wireless USB
Long term
Should happen in two to three years. Bears watching

  • 64-Bit Desktops
  • Application Integration
  • Business Intelligence
  • Cat 6/7
  • Digital Rights Management
  • Dual-core CPUs
  • Grid Computing
  • Mobile Applications
  • OLEDs
  • RFID
  • Speech Recognition
  • WiMAX
Very hot
Everyone will talk about it; early birds will implement. Others should watch this space

  • B2C E-commerce
  • Biometrics
  • Blade Servers
  • Collaboration in Apps
  • Color Lasers
  • Blog/Wiki
  • Consolidation/Virtualization
  • Data Security
  • Desktop Search
  • Development Tools
  • Dual-layered DVD Writers
  • EDGE
  • FireWire 2.0
  • Gigabit Switches
  • In-car Computing
  • IP SAN
  • Localization
  • Mobile Music Players
  • Motherboards
  • Multimedia on the Web
  • Operating Systems
  • Process/Workflow Automation
  • Smart Phones
  • Supercomputing
  • VoIP
Hot
Will gain critical mass; happening right now. You should do it

  • Anti Spam
  • Broadband
  • Desktop GUI
  • DLP
  • Firefox
  • Hardware Design
  • Intrusion Prevention
  • LCD Monitors
  • Multi-functional Devices
  • Mobile Gaming
  • Network Infrastructure Management
  • New Generation Personal Storage
  • Notebooks
  • Open Source
  • Outsourcing
  • PCI Express
  • Smart Cards
  • VPNs
  • Web Advertising
  • Web Services
  • Wireless LANs
Steady
Implementations have happened, and are continuing. You should have done it

  • Application Servers
  • B2B E-commerce
  • Back-up Technologies
  • Customer Relationship Management
  • Data Centers
  • E-governance
  • E-mail
  • Embedded Systems
  • Enterprise Business Apps (ERP + CRM + SCM)
  • Instant Messengers
  • Unified Threat Management
  • Online Scams
  • PC Multimedia
  • Power Conditioning
  • Routers
  • Standardization
Lukewarm
Nothing new; old news; losing visibility

  • Accounting Software
  • Application Service Provider
  • Bluetooth
  • NAS
  • Portals
  • Processor Speed
  • Thin Clients
  • Web Cameras
Fading out
Losing ground, as well as visibility, to newer trends and technologies

  • DMP/Line Printers
  • Inkjet Printers
  • PDAs
  • Scanners

Anil Chopra, Geetaj Channana and Krishna Kumar


Micro Payments
Micro payments are arbitrary, small-value payments made over the Internet, primarily for purchasing intangibles such as information. It is obviously in the B2C e-commerce space that the technological capability for making and accepting micro payments exists. There are also a number of micro payment service and technology providers, so that is not the issue either. The problem lies in keeping the cost of individual transactions proportionately low, both for the payer and the payee.

We do not see that changing in the immediate future to make micro payment mechanisms accepted widely.
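The cost problem is easy to see with some back-of-the-envelope arithmetic. The fee structure below is purely illustrative (a hypothetical gateway charging a flat Rs 2 plus 2 percent per transaction), not any real provider's tariff:

```python
def fee_share(amount, flat_fee=2.0, percent=0.02):
    """Return the fraction of a payment eaten by transaction fees.

    flat_fee and percent are illustrative numbers for a hypothetical
    payment gateway, not any real provider's pricing.
    """
    fee = flat_fee + amount * percent
    return fee / amount

# A Rs 500 purchase loses a small fraction to fees...
print(round(fee_share(500) * 100, 1))   # 2.4 (percent)
# ...but a Rs 5 micro payment loses over 40 percent of its value.
print(round(fee_share(5) * 100, 1))     # 42.0 (percent)
```

The flat component dominates as the payment shrinks, which is exactly why micro payments remain uneconomical for both sides.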

Biological Computing
It is generally accepted that Biological Computing could bring quantum changes in the way computing is understood today. Biological Computing is about using live cells as computing elements. Think of a computer that is a living organism, which can be grown roughly the way you grow tomatoes or yeast, and you begin to get the picture. As a topic of abiding interest, this one has been around for ages now. But we are no closer to making it a reality today than we were, say, this time last year, or five years before that.

Two other related ideas that come up in discussions around this topic are Molecular Computing and Quantum Computing. Like Biological Computing, they too are waiting for that one breakthrough which will take them from the realm of the research laboratory to that of everyday usage.

Nanotechnology
Nanotechnology is more about manufacturing than about computing: manufacturing at the molecular scale. If and when nanotechnology takes off, it would impact all areas of human endeavor, including computing. As an example, imagine what it would be like to have storage devices built at that scale.

The first prototypes are out of the labs, but mass production is still a long way off. Even when that happens, the first impact is likely to be on areas other than computing.

Robotics
Robotics is perhaps the only area where the buzz has been on since time immemorial, predating even computers. There has been a recent renewal of activity, with the advent of cheaper, more powerful mobile platforms.

The first working humanoid robot, which appeared only this year, has given a fillip to this area, and we expect interest to be at an all-time high, leading to new developments at a faster pace. So, in about three to five years from now, Robotics should take off and go mainstream, beyond the realms of automotive manufacturing and lawn mowers to which it is currently confined.

Application Lifecycle
Deploying an enterprise application is a pretty intense project, and it doesn't get over with the deployment. Plenty of tasks remain even after that, such as maintenance, upgrades and integration with other applications. Since these are all projects in themselves, the question now is 'where does application deployment end?' Work is on to find the answer, but till then, it remains a buzz.

Micro Fuel Cells
As electronic gadgets become more feature-rich and powerful, their power requirements will grow. Current technologies such as lithium polymer batteries will not be sufficient to handle these requirements, paving the way for micro fuel cells. These are considered the next-generation power source for mobile devices: they directly convert the chemical energy in hydrogen or methanol into electricity, with water vapor as the only byproduct. Companies in Japan and the US are leading the research.

Service-Oriented Architecture
There is already a success story associated with SOA: Web services, which made it possible for disparate systems to talk to each other through standard protocols. The next step is to spread this concept across application development of all types.

Unified Communication
We use many devices to communicate today: phone, cellphone, fax, e-mail, IM, PDA, SMS and so on. Unified communication aims to integrate all forms of communication and make them accessible from a single interface. Currently, most modes are accessible over the PC, though they still use their own proprietary interfaces. For instance, you can connect your cellphone to your PC and install software that lets you send and receive all SMS messages from it. But these are not integrated into a single mailbox, and it's going to be a major challenge to achieve that. It's therefore still a distant dream.

Utility Computing
Most people talk about utility computing as being analogous to electricity: a pay-per-use service. It treats your IT infrastructure as a single virtual resource pool that dynamically configures itself for a particular task. You pay for whatever you use, without worrying about individual components such as servers, storage and network devices.

Personal Area Networks
This is a network of devices such as a PDA, cellphone and PC in close proximity to a person. Bluetooth has been the major driver behind PANs, because of its non-intrusive, low-power nature. The solution proposed for the PAN is the piconet, a Bluetooth network of up to eight active devices connected in master-slave fashion. It sounds very exciting, but whether it will be widely adopted is still a question. Two factors will determine its acceptance: the availability of affordable Bluetooth in personal devices, and convergence. The latter could even kill the concept altogether, for if you can satisfy all your communication and computing needs from a single device, you won't need a PAN any more.

Wireless PANs 
A logical extension to personal area networks, with several technologies contesting for this space: Bluetooth, Ultrawideband and Zigbee are just a few names that come to mind. Bluetooth-based WPANs have already happened, but they didn't make the impact they promised. Others are still under development; at the time of writing, the Zigbee specs were expected to be released shortly. With so many standards fighting for this space, it's an interesting area to watch.

eXtensible Business Reporting Language
Short for eXtensible Business Reporting Language, XBRL is a relatively new concept that aims to change the way financial data is handled by computers. Instead of treating this information as mere text, standard formats identify different types of financial information, so that it can be analyzed, compared and processed as required. This can be a boon for organizations, as it will make handling financial information faster and easier. Its usage is still limited, and proper manifestations are yet to be seen.
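To make the idea concrete, here is a toy sketch in Python. The tag names below are invented for illustration and bear no relation to any real XBRL taxonomy; the point is simply that tagged financial data can be processed by a program instead of being read as mere text:

```python
import xml.etree.ElementTree as ET

# A hypothetical tagged filing; real XBRL uses standardized taxonomies
# and far richer metadata than these made-up element names.
filing = """
<report currency="INR">
  <revenue>1200</revenue>
  <expenses>900</expenses>
</report>
"""

root = ET.fromstring(filing)
revenue = float(root.findtext("revenue"))
expenses = float(root.findtext("expenses"))
print("Profit:", revenue - expenses)  # Profit: 300.0
```

Because the figures are tagged rather than buried in prose, comparing two companies' filings becomes a matter of reading the same elements from each document.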

Wireless USB
An extension of USB; instead of having to connect USB devices using cables, you’ll be able to do it wirelessly with the same high-speed connections as wired USB. 


Cat 6/7
It's been over two years since the Category 6 cabling standard was formalized by the TIA. Since then, the number of structured cabling installations using Cat 6 cables has grown. The Indian structured cabling market is expected to hit Rs 400 crore next year, as against Rs 326 crore this year, and most new installations seem to be using Cat 6 instead of Cat 5e. Cat 6 is stable now and will continue to grow in the years to come. Cat 7 is still a distant dream; even its standards are yet to be defined.

64-Bit Desktops
64-bit computing arrived on the desktop earlier this year with lots of fanfare. Apple was first off the blocks, but AMD made the maximum noise. And Intel, which had at first scoffed at the idea, finally had to follow suit. What all these fine gentlemen (and ladies) forgot was that having the hardware (or even the OS) was not enough. There have to be applications available, and compelling ones at that.

So, while you can buy all the 64-bit hardware you want, you have only 32-bit applications to run on it. We do not expect the situation to change any time soon, and none of the above-mentioned platform vendors has a visible application development or porting program in place that could make us hopeful for the immediate future. So, till those applications come along, 64-bit desktop computing will remain a case of having the cake but not being able to eat it.

Application Integration
A classic case of so near yet so far. Enterprise application integration has been much talked about for a few years now. Every year, renewed efforts are made to achieve the Holy Grail of a single view of the enterprise, and every year, the Grail remains as elusive as ever. Part of the problem lies in the innate complexity of the applications being integrated; another part in conflicting technologies and application logic. Beyond that, there are the conflicting cultures, environments and end objectives with which different divisions of the same organization work.

That said, a lot of progress has been made over the years, and as methodologies and the applications themselves mature, one can expect the success rate to improve. That could, in turn, start a self-fulfilling cycle, leading to more integration efforts and, in turn, more maturity in architecting the applications themselves.

Business Intelligence
BI is supposed to be the next step after data warehousing and mining. It is about making sense of all the historical data that resides in an organization's applications and applying it to decision-making. BI is a bit like ERP: it can change the internal rules of operation of the organization, and to that extent is a disruptive application. Implementation assumes that you already have a good technology infrastructure and application base in place. It also means that implementation takes time and money, and benefits could be a long time coming. That may be why it is yet to go beyond the very top tier of enterprises. Like ERP, BI will take its time to catch on.

Dual-core CPUs
Dual-core CPUs are in all probability going to be the buzz of the year. A dual-core CPU can loosely be considered as two processing units inside one CPU package, and it is the processor manufacturers' answer in their quest to keep increasing processor capabilities. The first of the dual cores, probably from AMD, is expected by the middle of next year, with Intel expected to follow soon after.

While both AMD and Intel have identified dual core as the way ahead, their CPUs will differ in architecture, with Intel sharing the cache across the cores while AMD opts for separate caches. But it is still early days, and the final implementations could be different.

Do not expect critical mass until towards the end of the year after next (2006) at the earliest, when, coincidentally, Microsoft is expected to launch Longhorn.

Grid Computing
Grid computing harnesses unused compute cycles on multiple machines to run complex problems. Till recently, even before the term grid computing came into being, the only problems being run were scientific enquiries such as the famous SETI@Home project (searching for extra terrestrial intelligence) that depended on volunteers downloading and installing specifically created client software.

The term grid computing came into being with the realization that similar free compute cycles within enterprises could be put to use to solve complex questions for the enterprise itself. A typical candidate is data warehousing; another could be financial modeling.

While solutions for computational grids are out from majors such as IBM and Oracle, real-life implementations have been few and far between. It will take at least two or three more years for enterprise grids to become slightly more commonplace. But even then, it is likely that community projects such as SETI will be greater in number (and definitely more visible) than enterprise grids.
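Stripped of all the middleware, the core idea is simply to split a large job into independent work units and hand them out to whatever compute resources are idle. A toy Python sketch, with threads standing in for machines on the grid and an arbitrary chunk size chosen for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def count_primes(lo, hi):
    """One work unit: count primes in [lo, hi) by trial division."""
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True
    return sum(1 for n in range(lo, hi) if is_prime(n))

# The "scheduler" splits the problem into independent chunks and
# farms them out; results are combined at the end.
chunks = [(i, i + 2500) for i in range(0, 10000, 2500)]
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(lambda c: count_primes(*c), chunks))

print(sum(partials))  # 1229 primes below 10,000
```

Real grids add the hard parts this sketch skips: discovering idle machines, shipping code and data to them, and tolerating nodes that disappear mid-job.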

Mobile Applications
Applications that sit on your enterprise server but display their screens on your cellphone: that is the concept behind mobile applications. The promise is that irrespective of where users are, they will always have access to applications and data residing on servers back in the office or in data centers across the globe. User interaction with the data could be over SMS, through a menu driven by the traditional cellphone keypad, or by touch screen on a smart phone.

One of the limiting factors for mobile applications has been the limited screen real estate on the cellphone, though the advent of smart phones has alleviated the problem to some extent. Applications, both on the enterprise front and in public services, are slowly building in a mobile interface, in much the same way as a Web interface was added to them years back.

RFID
Radio Frequency Identification can be used anywhere an item has to be uniquely identified, and is expected to replace the more common bar code system. Unlike bar codes, RFID does not require line of sight, and more machine-readable information can be incorporated into the tags.

The biggest opportunity for RFID lies in retail, and it has the potential to impact other areas too. RFID is already used in manufacturing, and the Walmart chain has created history of sorts by insisting that all its suppliers move over to RFID. But just as one swallow does not a summer make, a single chain is not enough to herald the mainstream acceptance of RFID. In India, where even bar code scanners are yet to go mainstream, expecting RFID to achieve that feat in the short term would be expecting a miracle.

OLEDs
Organic LED (OLED) technology isn't new at all; in fact, it was introduced ages ago by Eastman Kodak. It never managed to become commercially viable because cheaper display technologies such as CRT and LCD were available. But OLEDs hold a lot of promise for the future. Most display technology vendors, including Samsung, Sony, LG and Kodak, have announced that they will introduce OLED-based displays soon. Some early birds are already in the market, such as Kodak's LS633, the first OLED-based 3-megapixel digital camera, and Sony's Clie handheld with an OLED display, introduced in Japan. So while the OLED market will grow from a multi-million to a multi-billion dollar industry, it isn't going to replace LCDs any time soon.

WiMAX
Worldwide Interoperability for Microwave Access, or WiMAX, is basically meant to provide last-mile connectivity to broadband users wirelessly. While a wireless LAN spans 100-200 meters, WiMAX is more of a MAN (Metropolitan Area Network), which can span a radius of 50 km. Touted as the alternative to wired broadband technologies such as DSL and cable, WiMAX standard-based products are expected to begin appearing by the middle of next year, meaning that widespread adoption will take even longer.

Digital Rights Management
Passwords can be hacked and dongles cracked; audio CDs can be ripped and DVDs copied. But all this does not stop the DRM (Digital Rights Management) movement from spreading into our lives. Software-related DRM exists, but the industries that need it most are music and video, and we do not see help coming their way soon. While a lot of measures have been taken in this direction, nothing worthwhile has really emerged. WMA has its own way of handling DRM, and so do other formats, but the widespread popularity of MP3 has made it very difficult for artists as well as record labels to reduce copyright infringement. They have been forced to take the passive route, such as copyright notices, instead of enforcing restrictions on illegal content.

The need is acute but solutions are taking their time to come. There does not seem to be any concrete solution at least in the coming year.

Speech Recognition
The technology has been around for many years now, and has matured enough to be deployed in commercial applications. Speech recognition can be divided into three broad categories: desktop, embedded and network.

In the long term, it is likely to replace decade-old IVR systems, where customers have to push too many buttons. With ASR (Automatic Speech Recognition), customers would not have to push buttons on the phone, but could actually speak out the options.


Development Tools
Developer tools will always remain a top talking point in the development community. With India being a developer market, it is obvious that tools will always be hot in this country. The reason we have classified them as very hot is that newer versions of most tools are just out or on the anvil.

One major trend is that development tools are moving away from the nitty-gritties of coding to focus more on developer productivity and application architecture. The tools take care of the nuances of coding and leave the developer with time to focus on the larger issues of application architecture. Another trend is that even the most basic of tools are building in collaboration capabilities.

B2C E-commerce
We are witnessing the return of B2C e-commerce in a big way. After being scalded badly in the dotcom bust, primarily due to the lack of an enduring business model, compounded by the lack of adequate technology infrastructure, B2C is making a slow but sure comeback. Our hope this time rests on the fact that what is driving it is not discount offers, but a more mature benefit proposition for the potential buyer, as can be witnessed from the success of models such as those of contest2win and Air Deccan.

Netbanking is another area that is taking off, and ITC's e-chaupal is yet another. These examples may be few and far between, but they epitomize what sustainable B2C e-commerce is about: leveraging the Net to increase customer convenience, much like the telephone does.

Color Lasers
Color lasers have finally broken the price-quality barrier and are ready to go mainstream. 

As is happening in the inkjet segment, here too MFDs are making a strong play to be chosen over straightforward laser printers. But unlike in the inkjet market, the price differential between the two is still significant enough to tilt the balance, at least initially, in favor of the plainer cousin.

Blog/Wiki
The blog is today an accepted facet of digital life, much like the diary once was. And given the nature of the Internet, it was perhaps naive not to expect blogs to have a far-reaching impact. Public diaries maintained on the Internet, Web logs or blogs have become a powerful means of communication and of driving public opinion.

Companies are now using this power of the written word, freely expressed on the Net, internally as a knowledge management tool and externally as a PR mechanism. So, today, you have Jonathan Schwartz, COO of Sun Microsystems, and many Microsoft employees suddenly becoming active bloggers.

While the external impact of the blog is visible, it is the internal impact that is going to be more enduring. Knowledge Management is an area that is causing much heartburn inside organizations. The blog and the Wiki (a site-creation tool that allows users to edit content posted by others) are powerful tools that can drive the Knowledge Management bandwagon forward.

Desktop Search
Everybody, who is anybody in the software world, expects action here. Google, Apple, Microsoft… all of them have announced plans, if not already showcased products in the desktop search market (see our shootout elsewhere in this issue). And at least two of them, Google and Microsoft, are integrating Web search and local hard disk search into one tool.
While Google has received the maximum press so far, it is definitely not the first to offer a desktop search tool, not by any measure. Norton Utilities, even in its early avatars, had a fairly good string search within files.

At least two of the players, Apple and Microsoft, are building desktop search right into the OS, enabling search using rich metadata. The area is still evolving, and the battle can be expected to see many interesting twists and turns before settling down, most of them in the next year or two.
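Stripped to its essentials, the oldest form of desktop search is a brute-force string scan over files, much like the Norton utility mentioned above; the indexed, metadata-rich search the big players are building is far more sophisticated. A minimal sketch:

```python
import os

def grep_files(root, needle):
    """Return paths under root whose text content contains needle.

    Brute-force scan on every call; a real desktop search tool would
    build an index once and query it instantly.
    """
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if needle in f.read():
                        hits.append(path)
            except OSError:
                pass  # skip unreadable files
    return sorted(hits)
```

Called as, say, `grep_files("C:/Documents", "invoice")`, it returns every matching file, but the cost of rereading everything on each query is exactly why indexing won.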

Localization
Localization has been a buzz for some time now, with someone or the other, every now and then, proclaiming the need to have content and software available in local languages.

It is widely accepted that localization can drive PC, and in turn software, purchase and use, possibly like nothing else. Meanwhile, the work of the early pioneers, CDAC and Modular, has had only limited impact, and that too in the DTP and typesetting space. Indian-language versions of Office have also not had any major impact, partly because they are too recent.

A lot of action is now expected, as the big names get into the act of localizing their applications, but the real driver will be the availability of content in local languages. Good examples are China and Japan. 

Or take the case of the TV revolution in India. Anyone willing to do a Ramesh Sippy and make available the Ramayana or the Mahabharata in 14 Indian languages on the Internet? 

Workflow Automation
Yet another of the terms making a comeback. As enterprises start spending money after a longish lull to refresh their IT infrastructure, vendors are sure to up the pitch for workflow and process automation.

Most are not likely to take the plunge immediately, preferring to update their older systems first. Could become hot next year.

Supercomputing
The race to install the fastest supercomputer in the world can be seen as one between countries (the USA vs Japan) or between vendors (IBM, SGI, NEC and others). The Earth Simulator from Japan and NEC, the reigning champion in this class for the last two years, has been relegated to the third spot this time around by systems from the USA, built by IBM and SGI respectively.

Why this renewed focus on supercomputers? Obviously there are the bragging rights associated with having the top ranked one. But beyond that, super computing power is now attainable by clustering commercially available systems within fairly decent budgets. In turn, this means supercomputers are no longer available only to governments or to super rich enterprises. Smaller organizations can also harness their vast computing power. That is where the interest comes from. And the top ones are sort of role models or technology demonstrators, if you will. 

India has eight entries in the latest list of the top 500 supercomputers of the world, and, along with China (17) and South Korea (11), is among the countries to watch.

Collaboration in Apps
It is no longer sufficient to communicate using e-mail or voice calls. You also need other means of communication, such as calendaring, whiteboarding, instant messaging etc to work more effectively with your team. While collaboration tools have been around for ages, they were available either as standalone applications or as huge all-in-one solutions. Now, the move is towards building collaboration capabilities inside more applications. MS Office for instance, is shifting from a standalone office suite to a collaboration package through the SharePoint Portal server. AutoCAD 2005 (reviewed in this issue) lets you export a drawing to DWF format, which others can view, mark their comments on, etc. 

Data Security
There are two aspects to security. One is to secure a host, device or network from attacks, and the second is to secure the data itself. While you might have protected your network against all attacks, what about the data that travels out of it? How secure is that? 

That's where encryption and digital certificates, among other things, come into play. The concepts have been around for long, but adoption is still very slow, and will remain so next year.

Operating Systems
This space seems to be heating up again. Just as we are going to press, Sun has announced that Solaris 10 will be free, with payment for updates and support only. 

The initial buzz around Linux is dying down, with enterprises turning towards serious implementations. And Windows Longhorn is expected by the end of 2006. All in all, a very hot area.

IP SAN
The IP SAN has always been considered the poor man's SAN because of its low cost and easier deployment and maintenance compared to its Fibre Channel counterpart. Ever since iSCSI, the main technology behind IP SANs, was ratified as a standard in February 2003, lots of IP SAN products have been introduced. Deployments will be slow, and will take some time to reach the mass market. Until then, they will be at the center of a debate, with one camp claiming that they will replace Fibre Channel SANs, and the other saying they will complement them.

Biometrics
It's a pain to remember all your passwords and carry all your credit cards and access cards just to perform some simple tasks. But now these tasks can be carried out with just a thumb. Biometrics can be divided into two categories. Physical biometrics consists of techniques that scan physical characteristics: fingerprints, facial features, hand geometry, iris and retinal patterns, vascular patterns (the veins) and DNA. Behavioral biometrics includes measures of non-physical characteristics such as voice recognition, signature or handwriting analysis, and keystroke patterning, which measures the time spacing as you type.

The race has begun with fingerprint recognition devices that accompany desktops, notebooks and even keyboards, but don’t expect them to become ubiquitous by next year. 
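Keystroke patterning is simple to illustrate: record a timestamp per keypress and compare the inter-key gaps against a stored profile. The profile and tolerance below are invented for the sketch; a real system would use far more robust statistics than a fixed threshold:

```python
def intervals(timestamps):
    """Inter-key gaps (in seconds) from a list of keypress times."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def matches_profile(sample, profile, tolerance=0.05):
    """Crude check: every gap must be within `tolerance` seconds of
    the enrolled profile. The tolerance is illustrative, not taken
    from any real product."""
    gaps = intervals(sample)
    if len(gaps) != len(profile):
        return False
    return all(abs(g - p) <= tolerance for g, p in zip(gaps, profile))

# Enrolled typing rhythm for a four-key passphrase (hypothetical numbers).
enrolled = [0.12, 0.20, 0.15]
print(matches_profile([0.0, 0.11, 0.32, 0.46], enrolled))  # True
print(matches_profile([0.0, 0.40, 0.80, 1.20], enrolled))  # False
```

The same rhythm typed by someone else tends to show different gaps, which is what makes the timing itself a usable behavioral signature.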

Dual-layered DVD Writers 
DVD writers are hot territory and fast becoming mainstream. Speeds are doubling every four to six months and prices keep dropping, but the latest development in this market is the Dual Layer (DL) DVD writer. These writers enable you to write DVDs that carry more than 8 GB of data. One technology to watch out for in this space is the Blu-ray disc (BD). BDs, which can store up to 23 GB of data on a single side of the disc, are already available in a backup device from Sony, though it is still expensive enough to be out of reach for most people.

FireWire 800 (1394b) 
FireWire was the name given to the specification that started the fast data transfer trend in PCs. Its first spec, 1394a, allowed data transfer of up to 400 Mbps over a distance of 4.5 meters. The next version, 1394b, takes this up to 800 Mbps, and eventually up to 3.5 Gbps, over a distance of 100 meters. This is accomplished using CAT5 or plastic fiber-optic cabling between devices. But the new specification also requires new types of ports, making it a little difficult to adopt. Thus, instead of a straight upgrade, applications will be divided between the two types of ports according to need.

Motherboards
New ports, faster processors and new connectors mark the latest motherboards in the market. One of the major differentiators has come in the form of the PCI Express bus, as well as LGA (Land Grid Array) processors on the Intel front. Following suit, chipset manufacturers such as NVIDIA, VIA and SiS have also come out with chipsets that support these new specifications. You will see implementations of these motherboards early next year, but it will still take 10-12 months for them to become mainstream and totally eliminate technologies such as PCI, parallel ATA and DDR, giving way to PCIe, SATA and DDR2.

In-car Computing
From MP3 and DVD players to gaming consoles, cars seem to have it all. Computing in the car has gone beyond engine control, to adding comfort for passengers and car security. For instance, the Lincoln Aviator SUV from Ford has a GPS-based anti-theft system. You may not see all of this going mainstream in a hurry, but don't be surprised if your car one day adjusts the steering for you automatically.

Mobile Music Players 
On the surface it looks as if they are very popular with Gen X, but check the sales figures of most of these products in the country and you will find that they are not making many inroads into this market. The three basic reasons are the lack of retail outlets or experience in selling these products, the absence of enough 'gadget only' stores in the country to promote them, and their high cost. Finally, the cannibalization of the portable music player market by smart phones has also stifled their progress in the domestic market.

Smart Phones 
We had mobile phones; then we had LCD displays on them, which grew bigger and started providing more functionality; and then came color screens, changing the way we look at cellphones. In the past 12-14 months, we have seen a lot of smart phones in this market, ready to kick PDAs into history. You will keep seeing new phones introduced in this area; the Nokia 9500 Communicator, for instance, is due for launch in a month or so. But the price points and the sheer size of these devices will still keep them from gaining critical mass.

Multimedia on the Web 
Macromedia played a major role in promoting multimedia-rich applications on the Web, thanks to products such as Flash and the Flash player, ColdFusion and Communications Server.

While some Web applications have gone that way, many are yet to follow. That will take time, although there will be a lot of talk. In parallel, there is now a push to implement such applications on mobile devices. With the Flash player available for a lot of smart phones and PDAs, it won't be long before we have full-fledged, multimedia-enabled front ends for the mobile phone.

EDGE
Enhanced Data rates for Global Evolution, or EDGE, is a logical extension to GPRS that can theoretically support data rates of up to 384 kbps, making it the wannabe tech for the next year. At the time of writing, there are barely a handful of phones in the market that support EDGE. Mobile gaming and video content such as Airtel Live! and Hutch TV all utilize EDGE. EDGE, however, is not 3G. UMTS (Universal Mobile Telecommunications System) is the 3G technology that works using W-CDMA and supports data rates of up to 1920 kbps; it has taken off across the world and should come to India in a year and a half.

There are still some bottlenecks to its implementation, such as frequency and bandwidth requirements. EDGE will be really hot by the third quarter of 2005, and could be replaced by UMTS by the end of 2006 or early 2007 in India.
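To put those figures in perspective, here is a quick, illustrative calculation of best-case download times at the peak rates mentioned above. The 1 MB file size and the GPRS peak rate are our assumptions for comparison; real-world throughput is far lower than these theoretical maxima.

```python
# Illustrative arithmetic only: time to move a file at the peak
# (theoretical) data rates cited above. Treat as best-case figures.

RATES_KBPS = {
    "GPRS": 171.2,   # theoretical multi-slot peak (our assumption)
    "EDGE": 384.0,   # peak rate cited above
    "UMTS": 1920.0,  # peak rate cited above
}

def download_seconds(size_mb: float, rate_kbps: float) -> float:
    """Seconds to transfer size_mb megabytes at rate_kbps kilobits/sec."""
    bits = size_mb * 1024 * 1024 * 8
    return bits / (rate_kbps * 1000)

for tech, rate in RATES_KBPS.items():
    print(f"{tech}: {download_seconds(1, rate):.1f} s for 1 MB")
```

Even at theoretical peaks, the gap is stark: a 1 MB clip takes roughly 22 seconds over EDGE but under 5 seconds over UMTS.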

Blade Servers
Blades have been around for about two years now, but even today remain a distant dream for most organizations because of high cost and lack of standardization: if you buy a blade chassis from one vendor, you can't buy the blade servers from another. Deployment is therefore largely restricted to data centers or to places where space is at a premium. This year, some effort went into opening up blade specifications. Hopefully, we should see some standards emerge next year, but don't expect the market to flourish with standardized blade servers next year itself.

Consolidation and Virtualization
Consolidation involves reducing the complexity of your IT infrastructure, whether servers, storage or network devices, to make it more manageable and efficient. Virtualization, similarly, aggregates all these elements into virtual resource pools so that you can dynamically allocate or re-allocate them based on your business requirements. Virtualization treats IT as a service that is always there, helping you achieve your overall business objectives faster and more efficiently. While both concepts are very interesting, they are still far from wide implementation, possibly because the vendors follow no common standard and each suggests its own methods.

Gigabit Switches
Action is back in the network switch market after a long time, thanks to Gigabit Ethernet switches. Though Gigabit switches have been in the market for many years, they have remained beyond the reach of most companies due to their prohibitive costs. This year, their prices have crashed dramatically: today, you can buy a 16-port Gigabit switch for around Rs 25k. So while deployments will increase next year, they will mostly be restricted to the network backbone or to specific applications such as IP SAN deployments. Full-fledged deployments will happen only when companies start shifting to Gigabit Ethernet end to end, which is going to take a while.

VoIP
There was a lot of buzz around this about two years ago, when TRAI opened up the Internet telephony market to ISPs. But that was just one part of VoIP, wherein you could make voice calls over the Internet. It attracted individual users wanting to save on ISD bills when calling friends and relatives living abroad. Actually, the scope of VoIP goes far beyond that: as the name suggests, it is the technology that lets you carry voice and data on the same network, and it has tremendous potential in an enterprise environment. We feel there will be more VoIP implementations next year, though it is not going to take the country by storm. So, very hot.


Mobile Gaming
Recently, a friend of mine won a few thousand rupees in a gaming competition on his cellphone. Mobile gaming is where you will see a lot happen in the coming months. This drive is being created not only by cellphone companies but also by service providers such as Airtel and Hutch, and application vendors such as Yahoo, Rediff and Indiatimes. Four factors have caused the rise of this segment. First, the penetration of Java-enabled phones with color screens. Second, faster GPRS/EDGE services have made it easier to download and play games. Third, Bluetooth has made it easier to play multi-player games. Finally, the memory capacity of these devices has increased manifold, to hold more games, music, pictures and so on. Nokia has come up with the N-Gage and N-Gage QD cellphone-cum-gaming consoles. Expect a major push from service providers to cash in on your mobile gaming habits in the months to come, though it is unlikely that they will reach critical mass.

Anti Spam
Want to make a quick fortune? Make an anti-spam product that really works. Or better still, create an anti-spam technology that really works and then sell it to one of the players already in the market. As the volume of spam grows by leaps and bounds, the search for workable anti-spam solutions is also picking up momentum, and in the recent past the success rate of some of these products has increased dramatically. More than the products themselves, what will keep the category in the eye of the market will be attempts by vendors to define new standards and techniques to combat spam. There are many candidates in this area, and one can expect them to rejuvenate their efforts to gain acceptance during the year ahead.
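As a flavor of one widely discussed technique in this space, here is a minimal sketch of naive-Bayes spam scoring. The toy training messages are invented for illustration; real filters train on large corpora and layer on many refinements.

```python
import math

def train(spam_docs, ham_docs):
    """Count, per word, how many spam vs ham messages contain it."""
    counts = {}
    for label, docs in (("spam", spam_docs), ("ham", ham_docs)):
        for doc in docs:
            for word in set(doc.lower().split()):
                c = counts.setdefault(word, {"spam": 0, "ham": 0})
                c[label] += 1
    return counts

def spam_score(counts, n_spam, n_ham, message):
    """Sum of log-likelihood ratios; a positive score leans spam.
    Laplace smoothing keeps unseen words from zeroing the score."""
    score = 0.0
    for word in message.lower().split():
        c = counts.get(word, {"spam": 0, "ham": 0})
        p_spam = (c["spam"] + 1) / (n_spam + 2)
        p_ham = (c["ham"] + 1) / (n_ham + 2)
        score += math.log(p_spam / p_ham)
    return score

# Toy corpus, purely illustrative
spam = ["win cash prize now", "claim your free prize"]
ham = ["meeting at noon", "project status report"]
counts = train(spam, ham)
print(spam_score(counts, len(spam), len(ham), "free cash prize"))  # positive: spammy
print(spam_score(counts, len(spam), len(ham), "project meeting"))  # negative: clean
```

The appeal of this approach, and a reason the category keeps evolving, is that the filter learns from each user's own mail rather than relying on a fixed blacklist.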

Broadband
Broadband is hot, and not just because the national broadband policy has been announced; in fact, many are questioning elements of the policy and whether it will make a serious impact on broadband usage in the country. Broadband made a slow start a few years back, and it was only when the telephony players got into the act that some serious numbers were added to the broadband bandwagon. As telephony service providers become more and more competitive, they will have no choice but to bundle broadband with the basic service in order to realize more revenue. That will drive the first stage of broadband adoption in the year ahead. The real surge will happen when compelling content and applications in local languages become available.

Mozilla Firefox
Firefox is the only product to make it to our list of hundred. That should say something about it. Firefox, the open-source browser, has its origins in Netscape Navigator's failure to keep competing with IE. Over time, IE had emerged as the better browser, and that was also when Navigator was a paid product while IE was free. In a last act of defiance, Netscape opened up the source code for Navigator and spun off Mozilla as an attempt to build an open-source browser. Many false starts later, Firefox (earlier named Firebird) is today emerging as a new force in the browser market and could just end up restarting the browser wars.

At seven million downloads, it is still far, far behind the leader IE, but those seven million downloads have come in record time, and at a time when security worries about IE are at an all-time high. By the time you read this, Firefox 1.0 will be out. Expect to see and hear more about this one during the year ahead.

Multifunctional Devices
Convergence is killing the inkjet printer and the desktop scanner. For long, MFDs failed to gain mainstream acceptance because of prohibitive pricing and a lack of acceptable quality. About two years back, quality, particularly on the printing front, came on par with standalone units, and last year the price barrier was also breached, leading to a huge surge in MFD usage. While there are both laser- and inkjet-based MFDs, it is the latter that is growing in volumes, with laser-based ones as yet limited to the upper end of the corporate market. Granted, the number of MFDs out in the market is still way less than the installed base of, say, inkjet printers; but if you want an indicator of the way the market is going, just try to remember when you last saw an inkjet printer ad.

Notebooks
A year back, we predicted that notebooks were going to be hot and happening. A year down the line, there is plenty happening on the notebook front. Prices have come down, capabilities have increased, and so has offtake. With just under 2 lakh units expected to be sold this year, India is still more or less a virgin market when it comes to notebook usage. And that is part of the promise the market holds out to vendors: the potential for rapid growth. That potential is drawing more and more players into the arena, with offerings at both ends of the price spectrum.

The number of new players coming into the market is by itself sufficient to keep the tempo up in this space. Add to that the expected growth in demand of over 50%, and you can easily understand why we have categorized notebooks as hot.

Open Source
Open source was very hot this year. In the year ahead, it will stabilize into the hot category, as per our definition. That is, the early birds have already been there and done that, and the rest are expected to start their experimental dips, if not actual implementations next year. And this is expected to go beyond Linux and Apache onto other open-source software, including both desktop and enterprise applications. 

For this to happen, the lure of open source will have to extend beyond being 'free' to more quantifiable benefits for the user organization. We expect these quantifiable benefits to emerge more sharply during the year ahead. A good case in point is the browser Firefox: driven by a number of issues, including security concerns, its usage has seen a major spike in the past couple of months.

This also means that open-source/free software is likely to face a critical challenge in the year ahead: that of establishing and perpetuating these quantifiable benefits. Not all such software is expected to be able to establish its value proposition, and those which cannot will fail to create an impact and are likely to lose momentum. But there will be enough left standing for the category to remain hot.

Infrastructure Outsourcing
We are not talking about outsourcing in the BPO or offshoring sense. What we are talking about is the outsourcing of IT infrastructure maintenance, and perhaps even of the infrastructure itself, to third parties.

It is not as if this is a new concept; we have had AMC contracts ever since computers came into being, and for other equipment even before that. Outsourcing the management of the infrastructure is the next step; it has just taken its own time getting there. But now that the barrier has been broken, we expect more and more large enterprises to actively seek out outsourcing options for their IT infrastructure.

Web Advertising
On the tenth anniversary of the first banner ad on the Web, advertising on the Web has made a strong comeback and is creating opportunities that traditional advertising media such as print and even TV could not. Take, for example, search engine advertising: placing sponsored links alongside responses to queries at search engines. It is the fastest growing of all Web-based advertising. Another strongly developing area is context-sensitive advertising on content sites. Web advertising lends itself to a fairly accurate measure of who saw the message, for how long, and so on: one of the key metrics of the advertising world, which advertisers just love. That, combined with the relatively lower unit costs of Web advertising even while viewership is increasing, is what will make it hot and happening.

What all this means is that you are likely to come across more and more advertising on the sites you visit. Some of it may be in forms you do not immediately recognize as ads, while some may be of the in-your-face kind. Viewer discretion is recommended.

Network Infrastructure Management 
Network management products have been around for donkey's years, but only large organizations could afford them. Things are changing, though, and the notion of network management has now broadened into a wider concept known as network infrastructure management. This is a more granular approach, wherein organizations can pick and choose what they want to manage. So instead of buying a jumbo package, organizations can now choose specific modules from a larger package, or go for smaller packages to separately manage, say, storage, inventory, assets, support, applications or bandwidth. For instance, if an organization implements a SAN today, it needs software to manage it; if a business-critical application such as ERP runs on the network, it needs a network management package to ensure uptime. Lots of new technology implementations have been carried out on networks in the last two years. Expect next year to see the deployment of solutions to manage them.

Smart Cards 
They have been talked about since December 1998, when the Gujarat government started issuing smart card-based driving licenses. But the implementation has since fallen on its face because of, among other things, non-maintenance of infrastructure. The plight is such that the authorities do not even have smart card readers for the licenses. Yet when it was implemented, these were among the most tech-savvy driving licenses in the world. Many other states, such as Madhya Pradesh, also followed suit, with mixed results. One of the most successful implementations of smart cards has been the Bharat Petroleum petrocard, which has a customer base of more than 1.5 million. The applications of smart cards may be immense, but implementations are feasible only where large numbers are involved, such as bank credit cards, public sector undertakings and government departments. We might see a handful of such implementations in the coming year.

Desktop GUI
The first GUI to make an impact was from Apple, way back in January 1984. Since then, we have seen continuous improvement in the GUI. Apple continues to provide droolable GUIs, while others are busy making theirs more feature-rich. Linux, for instance, is becoming easier to use, thanks to developments in KDE and GNOME, its two major desktop environments. The latest version of GNOME, 2.6, has applications for voice recognition, text-to-speech, Braille decoding and more. Sun is also working on a Java-based 3D desktop for Linux and Solaris, codenamed Project Looking Glass (see this month's Linux section), which it claims will revolutionize the desktop GUI. Finally, Windows is going in for an absolute makeover with the Longhorn OS. Though that will take some time to surface, you might see hints of its new interface in forthcoming service packs or the Win XP release 2 that Microsoft announced recently.

New Generation Personal Storage
Floppies are passé, and so, to an extent, are CDs. Who would want to carry a 650 MB CD when you can have a rewritable flash storage device tucked into your pocket? With USB connectivity becoming standard, these devices have become really popular. Other personal storage options, such as flash cards and external hard drives, are also gaining popularity; you will find HDD-based digital audio players and digital cameras as well. The next few months will see prices of these portable devices falling, increasing their penetration even as capacities shoot up. Already, 300 GB external hard drives are available. Personal storage will also be in the news for its nuisance value: these devices have become one of the biggest security threats to organizations. So much so that the next version of Windows, called Longhorn, has a new PnP device policy to help administrators authorize devices such as flash disks in an organization.

Intrusion Prevention
This is the next logical step from its older cousin, the IDS (Intrusion Detection System). An IDS just warns you of a possible hack attack in progress on a specific host or on your network; it does not stop it. That is where IPSs (Intrusion Prevention Systems) come into the picture, since they do both. They are, in a way, a combination of an IDS and a firewall, where the former identifies attacks and the latter stops them. Today, their functionality has gone beyond these two functions to include protection against other malicious code such as viruses, spam and Trojans. So they serve as an all-in-one security solution for organizations as well as individuals. With security threats creeping in through every nook and cranny of your computer or network, the last thing you want is to have to remember what all to protect. This makes IPS a very attractive option.
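The IDS-plus-firewall coupling described above can be sketched in a few lines. The signatures, the packet representation and the class below are our own simplification for illustration, not any product's actual API.

```python
# Detection is IDS-style signature matching on the payload;
# prevention is firewall-style blocking of the offending source.

SIGNATURES = ["/etc/passwd", "cmd.exe", "xp_cmdshell"]  # illustrative only

class MiniIPS:
    def __init__(self):
        self.blocked = set()  # source IPs we refuse to serve

    def inspect(self, src_ip: str, payload: str) -> str:
        if src_ip in self.blocked:
            return "dropped"              # prevention: source already blocked
        for sig in SIGNATURES:
            if sig in payload:            # detection: signature match
                self.blocked.add(src_ip)  # prevention: block future traffic
                return "blocked"
        return "allowed"

ips = MiniIPS()
print(ips.inspect("10.0.0.5", "GET /index.html"))        # allowed
print(ips.inspect("10.0.0.5", "GET /../../etc/passwd"))  # blocked
print(ips.inspect("10.0.0.5", "GET /index.html"))        # dropped
```

A plain IDS would stop at logging the signature match; the extra step of adding the source to a block list is what turns detection into prevention.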

Wireless LANs
There are several reasons for the stupendous growth being witnessed in this segment: WiFi product costs have come down, more innovative WiFi products have been introduced, the technology is easy to set up, and performance has improved. At around 2K for a wireless card and under 10K for a wireless access point or router, even small offices can deploy WiFi. In fact, they can set up a complete network without using any cables at all. Wireless print servers and Webcams are also available.

Many WLAN entertainment products, such as wireless media adapters, also appeared this year. Wireless multi-player gaming is possible using two wireless game adapters, and you can have wireless Internet connectivity with a wireless router. External factors such as the astonishing growth in the notebook market have also contributed to this growth. Today, you can buy a decent notebook for the price of a desktop PC, at less than 40K. Plus, Intel has made WiFi an inherent feature in most notebooks through its Centrino technology. With so many wireless-enabled laptops, there will be more applications using them. WiFi hot spots are a likely
