How different is the world of computing now from when the first issue of Computerworld rolled off the presses in 1967?
Here's a glimpse: One day around that time, Edward Glaser, chairman of computer science at Case Western Reserve University, was giving some of his students a tour of the rooms that held the school's Univac 1107. As he stood in front of the computer's flashing lights, the sound of tape spinning in the background, Glaser said, "By the time you're my age, maybe 20 years from now, you'll be able to hold all this computing power in something the size of a book."
His students weren't impressed. "I remember us thinking, 'This guy is nuts,'" says Sheldon Laube, who recently retired as CIO of PricewaterhouseCoopers. Yet Glaser was, in fact, off by only a few years and several orders of magnitude in predicting the debut and the processing power of notebook computers.
Today, of course, the iPhone in Laube's pocket can do things that would overwhelm a Univac 1107 or any other multimillion-dollar computing behemoth of that era.
Thanks to the miniaturization of hardware, advances in storage and processing, vast improvements in software and the proliferation of high-speed networks, computing now belongs to the people.
Over the past 45 years, "the overarching trend is consumerization," says technology pundit Esther Dyson, chairwoman of EDventure Holdings, an investment firm. The IT leaders who read Computerworld "used to own all the computers, and now [their] customers do."
This brings one practical change, she notes: more technology choices for users, who have always wanted access to information via any device and any operating system, and now expect it.
For IT, it creates a new master: "Your 3-year-old kid can do things with your cellphone you can't," says Suren Gupta, executive vice president of technology and operations at Allstate. "[IT] better be on that curve. Kids and consumers are learning technology much faster, and we need to make sure we adapt our products to reflect that."
Technologies are created to improve life. Corporations use technology to become more efficient and to improve their ability to give customers what they want. Some corporations, those with foresight and flexibility, use it to create entirely new ways of doing things.
Without a doubt, high tech has reshaped the world in the past 45 years. The most visible example comes from the smart devices that millions of us keep within easy reach. Personal digital assistants, indeed: cellphones and tablets extend our beings into a realm no less real for being virtual. But it wasn't always this way.
Riding Moore's Law
"My father was working on computer programming and technology back in the '50s. He would come home and say, 'This is the hardest thing I've ever done. Whatever you do, stay away from these things,'" recalls Ray Lane, a managing partner at Kleiner Perkins Caufield & Byers, a Silicon Valley venture capital firm. Lane didn't listen to his father. After graduating from college, he became a systems analyst at IBM (he also did systems work in the military during the Vietnam War). By the early 1970s, he could write code in a formal language like Fortran ("Cobol was kind of for sissies," he says), submit a deck of punch cards and 24 hours later find out what mistakes he'd made.
Thanks to the relentless pace of Moore's Law, which posits that the number of transistors that can be put on a semiconductor will double every 18 months, the kind of computing power once available only to those who worked in austere information temples is now available in the palm of one's hand, says Lane. And today, those temples, or data centers as they're now known, all look more or less the same: They're made of servers with Intel chips inside, and they boast vast storage resources. We connect to them from anywhere, ultimately through the Internet's protocol, TCP/IP.
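To put that pace in perspective, here is some rough arithmetic of our own (not Lane's): the 45 years since 1967 contain 30 of those 18-month doubling periods, which works out to

2^(45 × 12 / 18) = 2^30 ≈ 1.07 × 10^9

or roughly a billionfold growth in transistor counts over Computerworld's lifetime.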
Chris Perretta, CIO at State Street, remembers that he had to drop a microprocessor lab class when he was an engineering student in the late 1970s because he fried a CPU; it was too expensive for him to get a second one. "People get mad now when [technology] breaks, and I'm amazed that it ever works!" he jokes. At this point, Perretta says, "we can build systems with basically infinite computing capacity and access to an incredible amount of data."
Connected, All the Time
That we can access that data from almost anywhere is a given now, but iconic personal computers like the IBM PC came without any networking capabilities, even though the Internet was more than a decade old at the time. People wanted to link those systems together, and one way they did it was through Ethernet, co-invented by Bob Metcalfe, founder of 3Com and a former publisher of Computerworld sister publication InfoWorld, who now teaches at the University of Texas at Austin and works as a venture capitalist at Polaris Ventures. "Let's say the Internet was born in 1969," says Metcalfe via email. "It has changed everything, and not only in computing. IBM used to run computing, and AT&T used to run communication. The Internet changed all that by 1985, [breaking up] the monopolies with open industry networking standards for PCs and networking, mainly HTML, HTTP, URL, TCP/IP, Ethernet."
The monopolies may be gone, but it wasn't until the past few years that nearly ubiquitous high-speed wireless Internet access became a given. "A few years ago, when you went to a conference, you sat in the front row because you were looking for a jack. Now the jack is pervasively around you," says Robert B. Carter, CIO at FedEx. In just the past few weeks, Carter was in Malaysia, Vietnam, Europe, India, Hong Kong and Singapore. "Never did I wonder how I was going to get a connection," he says.
High-speed, inexpensive, generally available connections have made it easier for the masses to get access to information. They have also helped make technology usage widespread, even second nature. And these developments have led to entirely new demands on IT, moving it from a caretaker and guardian of systems to an enabler of new ways of doing business.
24/7 Data Deluge
Take healthcare, for example. Today, "you can access on your mobile phone your patient record, your lab results, connect to a doctor and make an appointment in a matter of minutes," says Philip Fasano, CIO at health insurer Kaiser Permanente. Don't have a smartphone? You can do all that by voice or on a PC, too.
On the other hand, FedEx's Carter says, "I have a teenage daughter who manages to run out of unlimited texting. How do you do that? The math says she sends or receives a text every eight minutes, 7 by 24." Parental conundrums aside, he says his daughter's data usage illustrates one of the major IT issues of the day: "How do you [comb] through that much data and make it useful to a business like FedEx?"
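Carter's arithmetic, for the record, checks out under the obvious assumption of a 30-day month: one text every eight minutes, around the clock, is 1,440 ÷ 8 = 180 messages a day, or about 5,400 a month.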
He's not the only one asking that question. "At the time I started working, I couldn't have imagined we would ever deal with gigabytes of data," says Gupta. "At Allstate, we've got almost 3 petabytes of data about different things we do with customers." He says customers now expect just-in-time information, which means Allstate IT needs to make it possible to instantly sort through those petabytes of data and zip an answer back to the user, no matter what platform is used.
When it comes to expectations, the bar will only be set higher with the arrival of systems like IBM's Watson, the Jeopardy-playing computer. "Exponential growth in computer resources [allowed] us to get to things like IBM's Watson computer," says Fasano. "We're on the cusp of being able to get machines to think as quickly as we do."
That's a huge jump in artificial intelligence. Fasano says organizations will make the same jump in what they "know" and can deliver to customers. For healthcare companies like Kaiser Permanente, he says, the combination of big data and big math will make it possible to develop algorithms for predictive analytics that support truly personalized medical care.
What Solow Paradox?
IT advances, of course, go through phases of wheel-spinning as they get absorbed, and big data is no different. Now, though, nobody doubts that big data will make us more productive. But it took a while for technology in general to deliver a productivity payback.
Ian S. Patterson, CIO at Scottrade in St. Louis, started his career in 1981 as a purchasing manager for a now-defunct 32-store musical instrument retailer. To check inventory, "we'd call the stores on a weekly basis and go through what we knew were hot items and say, 'How many do you have left?'" Patterson says. Things improved a little after the chain moved from manual cash registers to IBM's electronic registers. Still, Patterson wanted daily sales reports, but what he got took a weekend and a batch process. He got so fed up that he went back to school and got a degree in MIS.
There were, of course, early groundbreaking applications, such as American Airlines' automated reservation system in 1960, which some have called the first real-time application, and American Hospital Supply's ASAP ordering system, launched in 1976 to allow hospital managers to place orders themselves. And there was Merrill Lynch's 1977 money market sweeper, which automatically moved available funds to and from money market accounts and included a credit card and check writing, and a Walmart/Procter & Gamble partnership that in 1988 produced a "continuous" replenishment supply chain.
"Those were iconic, charismatic information systems that rendered the competition irrelevant," says futurist and Computerworld columnist Thornton A. May. "Some of us said, 'These are not rare or isolated or strange. IT is by definition a marketplace solvent. IT changes things.'"
But IT doesn't change things overnight, or by magic. For every iconic IT project, there were dozens that didn't work out. That led to a lot of spending that didn't yield improved productivity, an "emperor's new clothes" paradox noted by Nobel Prize-winning economist Robert Solow, who in 1987 said, "You can see the computer age everywhere but in the productivity statistics." Time brought change, such as Michael Hammer's re-engineering revolution in the 1990s, and technology finally started driving productivity improvements, ending the Solow Paradox.
Patterson would be much happier as a purchasing manager today. "You're getting to the point where once I hit point of sale, I'm pretty much at real-time update," he says. "In the future, it's going to [update] once I pull it off the shelf or put it in the cart. They're going to know when I walk up to the register, and I'm going to get my receipt."
Interfaces Get Personal
Retailers and other companies might also be able to know what you're looking for based on patterns you display when shopping and searching. Scottrade is working on developing algorithms that will help it create the same conversational service environment for online customers that exists in its retail operations.
Right now, Patterson says, Scottrade doesn't have a way to find out if people visiting its website want information on opening an account because they want to save money for retirement or for their kids to go to college. A salesperson could get that information in person by simply asking a question, and Scottrade's website will be able to do that in the future, he says.
And one day, computers might literally ask us such questions, speaking to us and expecting a spoken response rather than having us type words on a screen. We don't use punch cards anymore, and we might be seeing the end of the keyboard era as well.
Today, "kids go to their computer and put their hand on the screen and try to move stuff around," Laube says, noting that he thinks voice will soon be the primary interface. "That's what Nuance just announced, building Siri into corporate apps. How cool is that?"
But voice interfaces may not work so well, says Ingo Elfering, vice president of business transformation for GlaxoSmithKline's Core Business Services unit. A native of Germany, Elfering finds that voice interfaces struggle with European accents and also fail to recognize some U.S. accents. He thinks the keyboard will remain a dominant interface, as it was with the Commodore VC 20 he purchased in 1980.
However, Elfering thinks the success of the iPad spells the end of paper. Companies will "try to get much more digital and take out the paper. We'll replace paper with computers that manage processes."
And in a few years, those computers will manage processes via 3D displays, says Allstate's Gupta, which means yet another technology for CIOs to manage.
Next-Gen CIOs
The CIOs of tomorrow will resemble today's CIOs about as much as current CIOs resemble the old heads of MIS departments. That's due in part to the fact that technology is now an everyday commodity, a development that has led to the consumerization of IT in the enterprise.
Elfering says that when people have more powerful computing setups at home than they do at work, IT must confront this fundamental question: "If more and more of what you do as an IT department becomes commodity, what does technology enable you to do that's unique to your business?"
He says the answer varies by industry and even by company, but one thing holds true everywhere: "IT is much, much more complex and much harder to manage."
Some of that difficulty is due to the fact that users know a lot more about IT than they once did, and they expect more from their systems. But it's also because "nowadays, you have all these complicated layers: everything from SAP to Hadoop clusters to virtualized desktops to Windows and Office, to complex clusters and Web-based systems," Elfering says.
Such a maelstrom of expectations and technology makes for turbulent times for CIOs. May puts it bluntly: CIOs need to become creative artists. Today's Fortune 500 CIOs represent the last survivors "of the ERP death march," he says.
While an ERP deployment is "an amazing feat of character and stamina," it isn't an act of creativity, May notes, suggesting that people who can build ERP systems have the wrong skills for a world where IT means the cloud, big data, social networks and mobile.
That's probably true, says State Street's Perretta. "When I started, you would steal stuff from work to play with at home, and now you steal it from home to play with at work," he says. Working in IT used to be about understanding technology. Now, he says, "it's more about what problem you're going to solve."
Given this shift, Perretta sees another trend that IT leaders might find hard to accept: "I suspect the tenure of CIOs will continue to decrease."
Professional Evolution: CIOs Hit Middle Age
Birthdays mean cake and presents and being the center of attention. But as we age, those birthday milestones cause introspection. Forty-five years after the launch of Computerworld, computing is in fabulous shape. CIOs, however, are in the midst of a midlife crisis, trying hard to keep pace with society's fervent embrace of technology.
While CIOs were rare 45 years ago, now their contributions and influence can't be missed. But what stymies CIOs in this day and age? Hackers, for one. Viruses used to be biological, not technical, and hackers were hobbyists, not well-organized fraudsters. Another problem is the skills gap. IT skills used to be a ticket to a secure career. Now, technology and expectations change so rapidly that even in a tech-savvy society, it's hard to find the right people or even know what training to give them.
In five years, when Computerworld celebrates its 50th anniversary, the role of the CIO will have changed again, and this time, expect it to be even more deeply integrated into most facets of the business.
– Michael Fitzgerald
Source: http://fireandsecuritynews.co.nz/45-years-of-creative-evolution-in-the-it-industry-and-beyond/