Thursday, February 25, 2010
A marketer’s guide to behavioral economics
Marketers have been applying behavioral economics—often unknowingly—for years. A more systematic approach can unlock significant value.
FEBRUARY 2010 • Ned Welch
Source: Marketing & Sales Practice
Long before behavioral economics had a name, marketers were using it. “Three for the price of two” offers and extended-payment layaway plans became widespread because they worked—not because marketers had run scientific studies showing that people prefer a supposedly free incentive to an equivalent price discount or that people often behave irrationally when thinking about future consequences. Yet despite marketing’s inadvertent leadership in using principles of behavioral economics, few companies use them in a systematic way. In this article, we highlight four practical techniques that should be part of every marketer’s tool kit.
1. Make a product’s cost less painful
…According to standard economic theory, the pain of payment should be identical for every dollar we spend. In marketing practice, however, many factors influence the way consumers value a dollar and how much pain they feel upon spending it.
Retailers know that allowing consumers to delay payment can dramatically increase their willingness to buy. … Payments, like all losses, are viscerally unpleasant. But emotions experienced in the present—now—are especially important. Even small delays in payment can soften the immediate sting of parting with your money and remove an important barrier to purchase.
Another way to minimize the pain of payment is to understand the ways “mental accounting” affects decision making. Consumers use different mental accounts for money they obtain from different sources rather than treating every dollar they own equally, as economists believe they do, or should. Commonly observed mental accounts include windfall gains, pocket money, income, and savings. Windfall gains and pocket money are usually the easiest for consumers to spend. Income is less easy to relinquish, and savings the most difficult of all. …
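A toy way to make the idea concrete is to attach a hypothetical propensity to spend to each mental account. The sketch below is purely illustrative (the propensities are my assumptions, not figures from the article), but it captures why the same $100 feels more spendable as a windfall than as savings.

```python
# Toy model of mental accounting. The propensities are illustrative
# assumptions, not data: they encode only the article's ordering
# (windfalls and pocket money are easiest to spend, savings hardest).
SPEND_PROPENSITY = {
    "windfall": 0.9,
    "pocket_money": 0.8,
    "income": 0.5,
    "savings": 0.1,
}

def expected_spending(balances):
    """Expected spending if each account is drawn down in proportion
    to its propensity."""
    return sum(SPEND_PROPENSITY[acct] * amt for acct, amt in balances.items())

# The same $100 "feels" very different depending on where it lands:
print(expected_spending({"windfall": 100}))  # 90.0
print(expected_spending({"savings": 100}))   # 10.0
```

For the fungible-money consumer economists assume, both calls would return the same number; the gap between them is the marketing opportunity.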
2. Harness the power of a default option
The evidence is overwhelming that presenting one option as a default increases the chance it will be chosen. … When we’re “given” something by default, it becomes more valued than it would have been otherwise—and we are more loath to part with it.
Savvy marketers can harness these principles. An Italian telecom company, for example, increased the acceptance rate of an offer made to customers when they called to cancel their service. Originally, a script informed them that they would receive 100 free calls if they kept their plan. The script was reworded to say, “We have already credited your account with 100 calls—how could you use those?” Many customers did not want to give up free talk time they felt they already owned.
Defaults work best when decision makers are too indifferent, confused, or conflicted to consider their options. That principle is particularly relevant in a world that’s increasingly awash with choices—a default eliminates the need to make a decision. The default, however, must also be a good choice for most people. Attempting to mislead customers will ultimately backfire by breeding distrust.
3. Don’t overwhelm consumers with choice
When a default option isn’t possible, marketers must be wary of generating “choice overload,” which makes consumers less likely to purchase. In a classic field experiment, some grocery store shoppers were offered the chance to taste a selection of 24 jams, while others were offered only 6. The greater variety drew more shoppers to sample the jams, but few made a purchase. By contrast, although fewer consumers stopped to taste the 6 jams on offer, sales from this group were more than five times higher.1
Large in-store assortments work against marketers in at least two ways. First, these choices make consumers work harder to find their preferred option… Second, large assortments increase the likelihood that each choice will become imbued with a “negative halo”—a heightened awareness that every option requires you to forgo desirable features available in some other product. Reducing the number of options makes people likelier not only to reach a decision but also to feel more satisfied with their choice.
4. Position your preferred option carefully
Economists assume that everything has a price: … each of us has a maximum price we’d be willing to pay. How marketers position a product, though, can change the equation. Consider the experience of the jewelry store owner whose consignment of turquoise jewelry wasn’t selling. … Exasperated, she gave her sales manager instructions to mark the lot down “x½” and departed on a buying trip. On her return, she found that the manager misread the note and had mistakenly doubled the price of the items—and sold the lot.2 In this case, shoppers almost certainly didn’t base their purchases on an absolute maximum price. Instead, they made inferences from the price about the jewelry’s quality, which generated a context-specific willingness to pay.
The power of this kind of relative positioning explains why marketers sometimes benefit from offering a few clearly inferior options. … Similarly, many restaurants find that the second-most-expensive bottle of wine is very popular—and so is the second-cheapest. Customers who buy the former feel they are getting something special but not going over the top. Those who buy the latter feel they are getting a bargain but not being cheap. …
Another way to position choices relates not to the products a company offers but to the way it displays them. Our research suggests, for instance, that ice cream shoppers in grocery stores look at the brand first, flavor second, and price last. Organizing supermarket aisles according to the way consumers prefer to buy specific products makes customers both happier and less likely to base their purchase decisions on price… For thermostats, by contrast, people generally start with price, then function, and finally brand. The merchandise layout should therefore be quite different.
Marketers have long been aware that irrationality helps shape consumer behavior. Behavioral economics can make that irrationality more predictable. Understanding exactly how small changes to the details of an offer can influence the way people react to it is crucial to unlocking significant value—often at very low cost.
About the Author
Ned Welch is a consultant in McKinsey’s Toronto office.
The author would like to acknowledge Micah May’s contribution to this article.
Notes
1 Sheena S. Iyengar and Mark R. Lepper, “When choice is demotivating: Can one desire too much of a good thing?” Journal of Personality and Social Psychology, 2000, Volume 79, Number 6, pp. 995–1006.
2 Robert B. Cialdini, Influence: Science and Practice, New York: HarperCollins, 1993.
Monday, February 22, 2010
Generation “Hexed”?
Source: PLANSPONSOR
Michael Barry
In the spirit of a modest proposal.
Retirement “benefits” are, in real life, income paid to people who aren’t working. … There are only two ways that can, for lack of a better word, “work.”
Alternative 1, prefunding. Current workers can prefund their benefit. … That’s how 401(k) plans, for instance, work.
Alternative 2, PAYGO. Current workers can pay for the benefits of retired workers. … This is generally called a “pay as you go” (PAYGO) system. … Generation 1 invests time and income raising and educating Generation 2—during a time when, because of youth, Generation 2 cannot take care of itself. Then, Generation 2 invests income (if not time) taking care of Generation 1—during a time when, because of age, Generation 1 cannot take care of itself.
…[A] PAYGO system—such as Social Security—works fine as long as each generation reproduces itself. However, when you have the situation we do currently—a declining population (in some European countries and Japan) or, at least, a significant decrease in population growth—you have a problem financing a PAYGO system. Fewer workers are supporting more retirees—which is hard on Generation 2.
So, why don’t we make explicit the premise underlying PAYGO (that each retiring generation will depend on the productive ability of its progeny)? If you don’t reproduce (defined as raising and educating children), you don’t get a Social Security benefit. If you (“you” being defined here as a couple) don’t have, whether naturally or by adoption, and provide for any children, then you’ll have to pay for your own retirement. If you have only one child, you can collect half your benefit.
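A minimal sketch of the proposed benefit schedule, using only the cases the column spells out (zero children, one child, and an implied full benefit for two or more); the function and its parameterization are mine, offered purely as illustration:

```python
def paygo_benefit_fraction(children_raised: int) -> float:
    """Fraction of the full Social Security benefit a couple would collect
    under the column's proposal, scaled by children raised and educated."""
    if children_raised <= 0:
        return 0.0   # no children: pay for your own retirement
    if children_raised == 1:
        return 0.5   # one child: half the benefit
    return 1.0       # two or more: full benefit (the implied baseline)

for kids in range(4):
    print(kids, paygo_benefit_fraction(kids))  # 0 0.0 / 1 0.5 / 2 1.0 / 3 1.0
```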
There are all sorts of quibbles that could be raised about this proposal: What about children who don’t survive to majority? What about foster children? What about single parents? And so on. All of these problems are solvable.
Bottom line: In a PAYGO system, people who, for whatever reason (choice, disability, or even legal impediments), do not have and raise children are free riders. When they retire, they will be living off the productive capacity of children someone else paid to raise and educate. However, … they should have more discretionary income than a comparable person who did have kids. … [They] should be required to use some of that spare cash to pay for their own retirement. …
I would extend this proposal to any social welfare system that is not means-tested and is funded on a PAYGO basis. I kind of think it would fix the (modest) problem we have with our Social Security system and the (significant) problem we have with Medicare.
Michael Barry is President of the Plan Advisory Services Group, a consulting group that helps financial services corporations with the regulatory issues facing their plan sponsor clients. He has had 30 years’ experience in the benefits field, in law and consulting firms.
Friday, February 19, 2010
A new look at carbon offsets
Carbon markets will continue to play a role in pricing—and limiting—emissions, but the opportunity in developing markets may be less promising than once expected.
FEBRUARY 2010 • Marcel Brinkman
Source: Climate Change Special Initiative
The CFO of any company that uses or produces energy was naturally interested in the outcome of the recent Copenhagen round of global climate negotiations, both for the potential new costs and for the new opportunities. Although the conference did not produce the legally binding global carbon-reduction treaty that many climate watchers had hoped for, regional (rather than global) carbon markets continue to evolve, and despite the uncertainty in Copenhagen, current global carbon market arrangements will probably survive. The prices these markets set for carbon emission allowances will become increasingly important for businesses—in particular, those facing the cost of buying allowances (so-called carbon credits) or developing projects for which carbon credits are anticipated sources of revenue.
Emission caps and related carbon trading in developed nations are a very effective way to reduce carbon emissions if supported by other forms of regulation, such as energy-efficiency standards. …
However, the role of carbon markets in developing nations (through offset financing) is still unclear and might be relatively limited compared with their role in developed nations. … Indeed, if carbon markets do not take off in developed nations in a major way, companies could be left holding credits for which there is no demand.
The economics of offset markets
Even though a global deal remains elusive, domestic and regional carbon markets will continue to grow—from slightly less than €100 billion in 2008 to around €800 billion in 2020, according to recent McKinsey estimates. The European Union, for example, already has a domestic carbon market—currently the only one of its size, with trading volumes expected to increase as the market matures and liquidity increases. The United States is poised to establish one, with climate change legislation awaiting action this year. And a number of other countries … are considering the introduction of domestic carbon markets. At the same time, multiple regional markets exist (within the United States, for example) or are being considered (as in China), mostly voluntary in nature.
Companies in these markets have a choice of reducing their own emissions to stay within their caps, buying credits from other companies, or buying international offsets. … Without a mechanism linking the various domestic carbon markets, prices, driven by local market conditions, will probably vary significantly.
The offset market plays a key role, as it is the de facto international carbon price mechanism, in the absence of direct market linkage. In theory, an originator of offset credits—say, an offset project developer—can sell its credits to a government in an Annex I country1 (which will use these credits to offset its carbon reduction commitments) or to a company in a domestic carbon market. These activities can create price arbitrage between various domestic carbon markets and the international carbon market.
Two factors hamper price equalization among the offset market, domestic carbon markets, and the global market in assigned amount units (AAUs) established under the 1997 Kyoto Protocol on climate change.
- On the one hand, countries have limited the amount of offsets that can be imported into domestic carbon markets. For instance, the European Union will allow only 1.6 metric gigatons2 (GT) of offset credits to be imported into its market from 2008 to 2020, or on average 0.1–0.2 GT per annum. …
- On the other hand, the demand for offsets from Annex I countries is less certain, as the global market is oversupplied with “hot air,”3 which limits the need to buy offset credits. …
Offset market supply also plays a key role in offset market prices. … As the market matures, more expensive sources of abatement, often requiring an upfront investment, will be pursued. Supply will also be determined by the offset market’s future structure. …There are also concerns about the so-called additionality of project-based offsets.4
Multiple proposals have been put on the table to scale up offset markets. …The eventual supply of credits and their relative cost will be determined by the choice of mechanism, as well as the type of offset credits allowed (for example, whether they include carbon capture and storage, nuclear power, or efforts to cut emissions by reducing deforestation and the degradation of forests).
McKinsey has developed a carbon market model based on the firm’s most recent greenhouse-gas-abatement cost curve.5 … The “hard” demand for offsets is expected to be around 1.4 GT by 2020—adding up demand from domestic carbon markets, including the European carbon market and the expected US one. …
The model calculates that 2020 carbon prices in the EU emission-trading system (around €29 a ton) will be well above the price in the offset market (around €13 a ton, which reflects the exhaustion of the system’s offset quota). The US carbon market price (€16 a ton) is much closer to the offset market price. The difference results from the offset discount factor proposed in the American Clean Energy and Security Act of 2009.6
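As a rough cross-check of those figures: with the EU's import quota exhausted, the EU price decouples from the offset price, while the US price stays tied to it through the discount factor. The sketch below assumes the 5-for-4 surrender ratio commonly attributed to the Waxman-Markey bill (an assumption on my part, since the article quotes only the resulting prices).

```python
# Back-of-the-envelope check on the model's 2020 prices (EUR per ton).
OFFSET_PRICE = 13.0    # modeled international offset price
DISCOUNT_RATIO = 1.25  # assumed: 5 offset tons surrendered per 4 compliance tons

# A US emitter keeps buying offsets until the domestic allowance price falls
# to the discounted cost of an offset ton:
implied_us_price = OFFSET_PRICE * DISCOUNT_RATIO
print(f"Implied US allowance price: ~EUR {implied_us_price:.0f}/ton")  # ~EUR 16
# The EU price (EUR 29/ton) sits well above this because its offset import
# quota is exhausted, cutting off arbitrage with the offset market.
```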
Abatement: A modest role in developing countries
The Intergovernmental Panel on Climate Change (IPCC) suggests that the global community needs to limit emissions to 44 GT in 2020 in order to hold global warming to two degrees Celsius.7 That goal would require global cuts of up to 17 GT of emissions by 2020. A large share of this decline would have to take place in developed nations, but their abatement potential is limited to 5 GT by 2020. Faster-growing developing nations have more room to make low-carbon choices in energy efficiency and power (6 GT by 2020) and hold most of the emission reduction potential of preserved forests (roughly another 6 GT by 2020), together accounting for the full 17 GT.
McKinsey’s carbon market model offers a view on the likely outcomes of the global regulatory debate, and in particular the role played by carbon markets. To do so, the model assesses the effectiveness of existing and proposed climate change regulations, including those outside the emissions directly capped by carbon markets. …
A detailed assessment of all proposals from Annex I and non–Annex I countries currently on the table8 shows that the world will be able to realize only about half of the emission reduction potential required to limit global warming to two degrees. Of this potential, 3 GT of reductions will be achieved as domestic abatement in Annex I countries, up to 2 GT will come from international offsets (which count toward the domestic abatement of Annex I countries), and a further 3 GT will be achieved by autonomous action from developing nations, potentially with financial support from Annex II nations.9
Actions currently envisioned by developing countries include a 70 percent reduction of deforestation in the Amazon rainforest by 2017 (which Brazil has proposed) and the increase of renewable power in China to 15 percent of its energy mix in 2020. … South Africa, for instance, proposes to let its emissions peak in 2025 and plateau before they begin declining after 2035.
Offset demand of up to 2 GT represents significant growth compared with 2008, when 140 megatons of offset credits were issued. Yet 2 GT is a relatively modest amount in light of the up to 17 GT of abatement required to limit global warming to two degrees.
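The arithmetic behind that "about half," using only the figures quoted above:

```python
# Tallying the article's own abatement figures (GT of emissions, by 2020).
required = 17  # cuts needed to stay on a two-degree path

delivered = {
    "Annex I domestic abatement": 3,
    "international offsets (upper bound)": 2,
    "developing-nation autonomous action": 3,
}
total = sum(delivered.values())
print(f"{total} GT of the {required} GT required: {total / required:.0%}")
# -> 8 GT of the 17 GT required: 47%, i.e., roughly half
```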
We need to be critical of this assessment, however, as the scenario modeled is only one possible outcome of ongoing discussions. … Japan has already announced a target of reducing emissions 25 percent below 1990 levels by 2020. Although that goal is conditioned on the willingness of other countries to take similarly bold action…
Furthermore, developed nations proposed substantial financial support for developing ones in the nonbinding political Copenhagen Accord: $30 billion in the period from 2010 to 2012 and up to $100 billion a year by 2020. … However, it might not be possible to achieve the recommended environmental outcome even given a more ambitious scenario with stricter national targets.
As a result of this uncertainty, companies are likely to move away from projects that rely completely on offsets as their income stream—such as the capture of gases other than carbon dioxide and the reduction of soot emissions from cooking stoves,10 with soot blamed for up to 18 percent of global warming. Instead, they will look for projects that also have other income streams, such as power market revenues and government subsidies, even if these projects require significantly more investment.11
About the Author
Marcel Brinkman is an associate principal in McKinsey’s London office.
Notes
1 Under the Kyoto Protocol, Annex I countries are those 37 industrialized nations that committed themselves to a reduction of greenhouse gases.
2 Metric tons: 1 metric ton = 2,205 pounds.
3 Russia, Ukraine, and various other Eastern European nations have emission caps above their current emission levels because their economies contracted sharply after the collapse of the Soviet Union in 1991. The result is a significant overhang of credits.
4 In other words, some projects might have been undertaken without any revenue from carbon credits and therefore may not have any “additional” environmental advantages.
5 McKinsey’s global greenhouse-gas-abatement cost curve assesses the technical potential to reduce carbon emissions and the cost by country, industry, and lever. For a full description, see “Pathways to a low-carbon economy,” available free of charge on mckinsey.com.
6 Sponsored by US Representatives Henry Waxman and Edward Markey, the act includes provisions on clean energy (and the transition to an economy based on it), energy efficiency, global warming, and agriculture- and forestry-related offsets.
7 This scenario assumes that carbon content in the atmosphere is reduced to 450 parts per million (ppm) by 2100, with an overshoot to 510 ppm in the intermediate period.
8 The proposals in the assessment include the recent submissions to the United Nations Framework Convention on Climate Change (January 31, 2010), the European Union’s commitment to reduce carbon emissions to 20 percent below the 1990 level by 2020, and the targets in the American Clean Energy and Security Act of 2009, passed by the US House of Representatives in 2009 and awaiting consideration by the Senate.
9 An Annex I subset of nations that have made a commitment to pay the incremental cost of mitigation and adaptation for developing (non–Annex I) nations. Annex II nations are Australia, Austria, Belgium, Canada, Denmark, the European Union, Finland, France, Germany, Greece, Iceland, Ireland, Italy, Japan, Luxembourg, the Netherlands, New Zealand, Norway, Portugal, Spain, Sweden, Switzerland, the United Kingdom, and the United States.
10 See Elisabeth Rosenthal, “Third-world stove soot is target in climate fight,” New York Times, April 15, 2009.
11 A company can claim offset income, however, only if a project is not otherwise expected to make a hurdle rate of return. The upside of such investments is therefore capped.
Wednesday, February 3, 2010
The New Golden Age
strategy+business
by Mark Stahlman
… The global economy is poised to enter a new phase of robust, dependable growth. Technological and economic historian Carlota Perez calls it a “golden age.” Such ages occur roughly every 60 years, and they last for a decade or more, part of a long cycle of technological change and financial activity.
…[Although the] details of long cycles vary, the overall pattern of progress remains the same: an economy spends 30 years in what Perez calls “installation,” using financial capital (largely from investors) to put new technologies in place. Ultimately, overinvestment and excessive speculation lead to a financial crisis, after which installation gives way to “deployment”: a time of gradually increasing prosperity and income from improved goods and services.
This time, linchpins of the golden age will include the worldwide build-out of a new services-oriented infrastructure based on digital technology and a general shift to cleaner energy and environmentally safer technologies. In the emerging markets of China, India, Brazil, Russia, and dozens of smaller developing nations, a billion people will enter the expanding global middle class. …
Tracking the Cycle
Long cycles of technology and investment have been tracked and analyzed by an impressive roster of scholars, including Perez and Joseph Schumpeter. (See “Carlota Perez: The Thought Leader Interview,” by Art Kleiner, s+b, Winter 2005.) Five such cycles have occurred since the late 1700s. The first, lasting from the 1770s through the 1820s, was based on water power and introduced factories and canals, primarily in Britain. The second, the age of steam, coal, iron, and railways, lasted from the 1820s to the 1870s. The third, involving steel and heavy engineering (the giant electrical and transportation technologies of the Gilded Age), expanded to include Germany and the United States. This cycle ended around 1910, giving way to the mass production era of the 20th century, a fourth long cycle encompassing the rise of the automobile, petroleum-based materials, the assembly line, and the motion picture and television.
Our current long cycle, which began around 1970, is based on silicon: the integrated circuit, the digital computer, global telecommunications, and the Internet. … In a typical “techno-economic paradigm,” as Perez calls it, new technologies are rolled out during the first 30 years of installation with funding from financial capital. Investors are drawn in because they receive speculative gains that come, in effect, from other people making similar investments. … As some bets lead to rapid gains, enthusiasm and impatience fuel a more widespread appetite for jumping on board, risks be damned. The consequence is irrational exuberance, a crash — and then a period of crisis.
The current crisis began in 2000 with the Internet bubble collapse. It was prolonged by the financial-services industry. Not wanting to give up easy profits, and applying the technological innovations that computer “geeks” had provided, traders continued to push for rapid returns. … This culminated in the catastrophic meltdown of 2008 and a historic moment of shifting establishment priorities.
Every crisis ends in such a moment. The last crisis, which began with the stock market crash of 1929, ended with the Bretton Woods agreements of 1944. In each case, once the widespread debacle bottoms out, the speculators of the old era are reined in, expectations are reset, and new business and government elites start to rebuild the world’s governing institutions. After World War II, the locus of power and influence was the oil economy. … The symbols of elite power, including the Rockefeller-built World Trade Center, were all linked to oil.
Only with a similar restructuring can a new period of extended growth, a golden age, be ushered in. This time, the leaders will be linked to silicon. IBM, Intel, and Microsoft will be more important in the next two decades than Exxon or the World Bank. …
When deployment begins, general assumptions about business shift accordingly. Financial capital, which is relatively indifferent to particular technologies, becomes less of an economic force. Businesses depend more on industrial capital, derived from profits from the sale of goods and services. Executives with a greater interest in long-term stability than in rapid returns are placed in charge of global affairs.
There are clear signs that this is happening now. Financial regulations are being put in place around the world to improve market monitoring, limit leverage, and mandate heftier reserves. …
One telling indicator of this shift from speculation to real growth is the official attitude toward bubbles. In the 1990s, the U.S. Federal Reserve, under Alan Greenspan, took a hands-off approach to speculation. Now the Fed is discussing what actions it might take to cool off overheated markets in advance, and is admitting that its earlier approach to bubbles and risk management was a mistake. New authority is being sought by regulators such as the U.S. Commodity Futures Trading Commission and its European counterparts. …
The Emerging Silicon Economy
Goldman Sachs will probably be part of the new Silicon Establishment, along with dominant enterprises in information and communications technology and others involved in deploying these technologies. For the first time in decades, a commonality of purpose and shared reservoir of knowledge will bridge the many differences among governing bodies. … Both customers and manufacturers have learned to factor life-cycle costs and long-term plans into their decisions.
The priorities of the new technology-based elite include access to larger groups of customers, such as those in emerging nations. Thus, one hallmark of the coming golden age will be its global inclusiveness. Although oppression and slavery may remain widespread, the social systems that reinforced a “haves” and “have-nots” status quo, holding back economic opportunities for the majority of the human population, will give way. …
A new global economic infrastructure is emerging, built on networked, shared computing resources and commonly called cloud computing. … A more responsible approach to the natural environment is also gaining ground, one that advocates using energy more efficiently and reducing pollution, greenhouse gases, and hazardous waste. Meanwhile, innovative new service offerings will displace entrenched but inefficient medical and financial practices.
…For those who would like to continue rolling the dice of global finance, a more planned and regulated future will feel like an attack on freedom. Adding a billion new people to the global middle class will add to the labor arbitrage that has already begun to affect many lawyers, journalists, software engineers, and accountants. It will now affect professionals in health, finance, and education. …
After a couple of decades, the silicon era will grow moribund, as the oil era did before it. Sometime around 2030, there will be a silicon equivalent to the oil crisis of the early 1970s. Then a new long cycle will emerge. This one will probably be based on the technologies just emerging now: biotechnology and nanotechnology, along with molecular manufacturing (the ability to cheaply build any material from scratch). Then the pattern of frenzied investment will begin again, with another cycle to come.
Author Profile:
- Mark Stahlman is a Wall Street technology strategist who has been writing about tech-driven growth cycles for more than 20 years.
Tuesday, February 2, 2010
10 Trends for 2010: Piecing Together a Technology Strategy
By Samuel Greengard
2009-12-08
Despite a brutal economy and tight budgets, organizations are making plans to deploy the technologies that are most likely to drive their business in 2010. Here are 10 business and technology trends that will help solidify those plans. …
Following are the 10 most significant technology trends for next year, based on a survey of almost 1,200 technology and business managers, conducted by Ziff Davis Enterprise Research.
1 Green Computing and Energy Efficiency
… Skyrocketing energy costs and tight budgets, coupled with growing public and government pressure, have forced companies to put this issue on the front burner. …
…Better energy auditing tools, a more thorough understanding of carbon footprints, improved engineering and design, and a developing ecosystem for managing equipment from cradle to grave all make green computing more feasible.
In addition, organizations are adopting new and improved tools for managing computers and ensuring that they’re in sleep mode when they’re not in use. Many organizations are also getting serious about training employees to switch systems off when they’re not needed.
Fisher adds that manufacturers are beginning to place data about energy usage on their products, and companies are accelerating refresh cycles to take advantage of technology advances and energy savings. …
2 Public and Private Cloud Computing
… Two-thirds of Baseline survey respondents plan to expand the use of public clouds, which reside on the Internet, provide access to shared computing resources and are operated by third-party providers. Sixty-four percent said they’re interested in private clouds, which, according to the National Institute of Standards and Technology, are “owned or leased by a single organization and operated solely for that organization.” …
Organizations are also turning to clouds to keep mobile data in sync. Apple, Research in Motion and other vendors have simplified syncing contacts, e-mails, notes and calendar items across multiple devices. …
3 Virtual Desktop Infrastructure (VDI)
…Interest in VDI is growing rapidly. The technology virtualizes a desktop and stores it on a remote central server. By making desktops and data more uniform and available—across various platforms and devices in the enterprise—it’s possible to weather a natural or human-caused disruption with minimal downtime or loss of productivity. …
4 Mobility, Telecommuting and Virtual Meetings
…Wireless networks are becoming ubiquitous, devices are advancing rapidly, and an array of tools and technologies are making virtual meetings, collaboration and telecommuting a seamless proposition. Thirty-five percent of Baseline survey respondents said they’re expecting the use of these tools to increase in 2010. …
This connected and collaborative environment also promises to usher in better desktop video conferencing, along with more advanced telepresence capabilities. The widespread availability of high-bandwidth networks, along with more sophisticated and less-expensive technologies, makes it possible for organizations to work virtually and seamlessly. …
5 Centralization, Standards and Governance
…Baseline’s survey of IT executives indicates that 85 percent of organizations will boost their investment in governance processes and applications in 2010. Mobility, managed services, cloud computing, virtualization, Web 2.0, security, SLA management and an array of other initiatives—often revolving around more effective asset management—have prompted organizations to focus on developing better governance and standardization strategies.
In addition, businesses find themselves facing a growing array of government and industry regulations. As a result, governance, risk and compliance (GRC) play an important role in corporate strategy. …
6 Knowledge Sharing, Business Intelligence and Social Networking
…Web 2.0—including blogs, wikis and social networking—has transformed the landscape and made knowledge sharing a reality. At the same time, XML-based tools and service-oriented architecture (SOA) components have made it easier and simpler to share documents and data.
… More than two-thirds of Baseline respondents indicated increased interest in social networking at their firms, and 60 percent said their companies are gravitating toward knowledge and document management applications. …
In some cases, organizations are adapting social media and combining these tools with business intelligence to provide real-time analytics on how data, information and knowledge are flowing throughout the organization—and beyond. … Other enterprises are tapping social media to assemble teams, document practices and expertise, and to identify subject matter experts who would have fallen between the cracks in the past. …
Meanwhile, many other organizations are using social networking to handle everything from sales to customer support.
7 Security, E-Discovery and Business Continuity
Cyber-security, business continuity and managing risk are all core issues for any organization. Although the Internet and increasingly sophisticated technology have created enormous business opportunities, the risk of a security breach and the threat of downtime are growing. Worse, the cost of a failure can prove catastrophic. …
Unfortunately, as the calendar rolls over to 2010, this laissez-faire attitude about security and other risk-oriented issues—including business continuity and e-discovery—could prove costly. Baseline found that 70 percent of companies expect little or no significant investment in security, and 71 percent expect little or no significant investment in business continuity. …
8 Advances in Application Infrastructure
…One of the biggest trends is the widespread use of open source code. From running operating systems to handling Web programming, it has changed the face of computing.
…Baseline found that 22 percent of IT executives expect increased investment in application infrastructure next year.
At the same time, Manes sees ongoing interest in software as a service, SOA and business process management. Major enterprise applications are also opening up through APIs, and many of them are moving into the cloud as well. Not surprisingly, mainstream software providers are tweaking and adapting their applications to keep pace with the growing demand. …
9 Investments in Hardware Infrastructure
…To be sure, organizations are looking to step up hardware and networking investments. Approximately 43 percent of respondents to the Baseline survey said they expect their companies to spend more on hardware, and 42 percent said their firms will increase spending on storage or storage systems.
In addition to virtualization, organizations are looking at Fibre Channel over Ethernet to build a more unified computing infrastructure. They’re also seeking more advanced management tools and investigating ways to integrate cloud computing into the internal IT environment.
An emerging trend is the use of solid-state drives, which offer greater dependability and energy savings. …
10 Collaboration, Workflow and Productivity
…The extension of productivity and workflow to the mobile environment is a huge trend. Thirty-five percent of Baseline survey respondents said that mobility systems will expand at their company in 2010. …
In fact, mobile access to SharePoint, BI, reporting dashboards, document viewers, databases and CRM apps is fast becoming the norm. …
Document and file sharing are advancing in other ways, too. About 25 percent of the survey respondents said that workflow apps will be more prominent at their companies. Thanks to technologies such as SharePoint and Adobe Flex, paper and static forms are bowing to workflow automation, data capture, e-forms, e-signatures and collaboration tools. …
How We Conducted the Research
A two-stage study was conducted for this article by Ziff Davis Enterprise Research. In the first stage, 300 technology and business professionals and managers involved in technology at organizations of all sizes were polled using an open-ended questionnaire. …
These responses were then analyzed, so that the trends that were mentioned most often could be tested in the second, quantitative stage of the study. The trends list arising out of the first stage was supplemented with input from the editors and experts to ensure completeness and clarity. In the second stage, a multiple-choice questionnaire was fielded to 878 technology and business managers in firms with at least 100 employees: 248 in firms with 100 to 499 employees, 398 in firms with 500 to 9,999 employees and 232 in firms with 10,000 or more employees. Of the 878 respondents, 230 had vice president or higher titles, 236 had director titles and 412 had manager titles.
The second-stage survey asked a series of questions about each trend in order to gauge the relative strength of each, as well as the chief factors that might be driving or potentially hindering it. The trends covered in this story are the 10 that received the strongest results because of widespread adoption, intense (highly committed) adoption or both.
Monday, February 1, 2010
Consumer misconceptions abound about funding long-term care
Published 12/2/2009
…[A] recent Home Instead Senior Care study … conducted by the Boomer Project (www.boomerproject.com) among 166 seniors and 444 adults revealed that both seniors and adult children expect to use Social Security and Medicare to pay for senior care. In truth, neither of these programs is a viable funding mechanism for long-term care. Study participants were less likely to identify the sources of funds typically used to pay for senior care, such as personal savings and retirement plans. …
“The reality is the best-laid retirement plans will be wiped out by a long-term care event,” says Bill Comfort, a long-term care insurance specialist, broker, and trainer who owns Comfort Assurance Group in St. Louis. “People fail to consider the extra costs associated with a long-term care disability in retirement, and that nothing will pay for the kind of care they want except their own money.”
The idea that Medicare and Social Security will pay for senior care is rooted in the misconception that … a government entitlement program … will cover such costs. “Many people do see the government taking care of disabled seniors in nursing homes,” Comfort says. “…Medicare only covers short-term acute and rehabilitative costs. When a nursing home is needed, Medicaid — a ‘means-tested’ welfare program designed to help the poor of all ages — will pay. But that’s only when a senior has exhausted almost all of his or her own resources. And Medicaid generally pays only for care where a senior least wants to go: a certified nursing home.”
Medicaid not only requires seniors to deplete their assets; once qualified, they must also pay any remaining monthly income, including Social Security or a pension check, to the nursing home, with Medicaid covering only the difference between the senior’s remaining income and the nursing home’s monthly charge. If, say, a nursing home charges $6,000 a month and a senior’s income is $1,500, the senior pays the $1,500 and Medicaid pays the remaining $4,500. …
Comfort relates a story about a client who pays for long-term care insurance for her father as a result of an experience with her stepmother. “Her stepmother needed Alzheimer’s care, and she qualified immediately for Medicaid. What the family didn’t realize is that they couldn’t choose the nursing home they wanted, so she was placed farther from home,” Comfort says. “The daughter is paying for long-term care insurance now so that her father has more options if he needs care. …”
…“Growing older, which we all hope to do, will create some need for care, and that costs money…,” Comfort says. …
Paul Hogan is co-founder and CEO of Home Instead Senior Care. Home Instead Senior Care is among the nation’s largest providers of at-home care for seniors and has served more than 400,000 clients through a network of 800 franchise offices in the U.S. and 15 other countries. Hogan and his wife, Lori, are co-authors of Stages of Senior Care: Your Step-by-Step Guide to Making the Best Decisions (McGraw-Hill, November 2009). For more information, go to http://www.stagesofseniorcare.com/.