BY KEVIN MOE
In the nearly 100 years of the Carlson School’s existence, its faculty has produced an unending stream of groundbreaking research, pushing boundaries in both academia and the business world at large.
Professor and General Mills/Paul S. Gerot Chair in Marketing George John’s modeling of industrial buyer-supplier ties, “Alliances in Industrial Purchasing: The Determinants of Joint Action in Buyer-Supplier Relationships” (Journal of Marketing Research, 1990), is widely known as a classic in the field. Work and Organizations Professor Avner Ben-Ner’s more recent “Treadmill Workstations: The Effects of Walking while Working on Physical Activity and Work Performance” (PLOS ONE, 2014) has many companies seriously considering adjusting their workspaces.
These are two examples of Carlson School research that moved the needle. But there are many, many others:
Accounting’s ‘Real Effects’
To the general public, accounting is often viewed as a perplexing—and boring—tangle of record keeping rules. But it is much more than that. Accounting is the yardstick by which business transactions are measured, aggregated, and reported to the capital market.
“These accounting measurements and disclosures matter in a very real sense, because how we measure and report on firms’ business transactions will alter those transactions,” says Professor Chandra Kanodia, the Arthur Andersen & Co./Kullberg Chair in Accounting and Information Systems. “Many instances of these phenomena have been studied. Mark-to-market accounting is believed to have significantly exacerbated the 2008-09 financial crisis. Deficiencies in measuring investment induce firms to adopt myopic investment strategies, and accounting policy for derivatives affects firms’ risk-management strategies. The main reason for these accounting-induced real effects is that corporate managers are deeply concerned about how their decisions will ‘play’ on the capital market.”
The theory of how accounting measurement and disclosure have real effects was first developed by Kanodia more than 35 years ago in his doctoral dissertation, “Effects of Shareholder Information on Corporate Decisions and Capital Market Equilibrium” (Econometrica, 1980). Over the subsequent years, Kanodia and his students have continued to develop the real effects perspective and have applied it to numerous accounting debates.
The presence of real effects has major implications for the setting of accounting standards by the Financial Accounting Standards Board (FASB) and the Securities and Exchange Commission (SEC). Historically, these agencies have established accounting standards as if accountants were mere observers of an objective reality independent of accounting. “The key guiding principle has been that any financial information that is relevant to investors and that can be provided with sufficient reliability should be provided,” Kanodia says. “The principle is due to the belief that more information can only be better than less, and that any disclosure that moves stock prices must necessarily improve resource allocation.”
The literature on real effects actually shows much more complexity. More transparency does yield more accurate information and improve investors’ portfolio decisions, but if that information changes firms’ fundamentals, portfolio returns change as well, and investors could end up worse off.
Consistent with Kanodia’s arguments, corporations that have lobbied against proposed changes in accounting standards have said they would change their business strategies in response to new standards.
“But FASB has been skeptical. In academic circles, while the real effects literature was respected for its logical consistency and precision, it was considered a cry in the wilderness,” he says. “But all that has changed after the recent financial crisis. There is now a surge of interest in studying accounting-induced real effects that spans virtually all of the leading business schools.”
In addition, empirical researchers have been examining field data and finding the real effects predicted by theory. In recognition of its growing presence, the Journal of Accounting Research recently commissioned Kanodia to publish a survey of the real effects literature and its policy implications.
The Value of Voluntary Disclosures
In 1968, Professors Ray Ball and Philip Brown at the University of Chicago wrote what became the most influential paper in accounting research. “An Empirical Evaluation of Accounting Income Numbers” (Journal of Accounting Research, 1968) was the first paper in accounting to assess the usefulness of accounting information using newly developed methods in finance for examining whether that information is impounded in stock prices.
“They found that the information in accounting net income is indeed impounded in the stock price, meaning that it is useful,” says Professor Frank Gigler. “But the information in accounting income is impounded in stock prices before the financial statements are made public, bringing into question whether providing public financial statements is really useful and suggesting that it may in fact be entirely redundant.”
Thirty years later, Gigler, Accounting Department chair and Curtis L. Carlson Chair of Accounting, and co-author Professor Thomas Hemmer published the first in a series of papers developing a theory that explains why the results found in Ball and Brown are exactly what one would expect when accounting is playing a “confirmatory role.” Gigler and Hemmer’s “On the Frequency, Quality, and Informational Role of Financial Reports” (Journal of Accounting Research, 1998) argued that if the accounting information in financial statements provides the discipline that makes firms’ voluntary disclosures credible, then a muted stock price reaction to the financial statements may be evidence that the statements are doing a good job of disciplining those voluntary disclosures.
“When markets believe what firms disclose on their own volition, the reaction to the information they are required to have audited and are mandated to disclose may be muted,” Gigler says. “But this doesn’t mean that publishing financial statements is of no value, because if it wasn’t for the threat of being caught lying that disclosing audited financial statements creates, the information that firms voluntarily disclose might not be viewed as credible.”
Gigler says that the most meaningful measure of the impact of this work came when Professor Ball wrote “The Complementary Roles of Audited Financial Statements and Voluntary Disclosure: A Test of the Confirmation Hypothesis” (Journal of Accounting and Economics, 2012), in which Gigler’s work with Hemmer provides the underlying hypothesis. “Imagine how it feels to see one of the absolute pioneers in your field adopting your thoughts more than 40 years after he wrote the paper that inspired your thinking,” he says. “Let alone seeing that his results confirm your theory!”
Solving a Debt Security Puzzle
“The Determinants of Credit Spread Changes” (Journal of Finance, 2001) by Professor and C. Arthur Williams, Jr./Minnesota Insurance Industry Chair Robert Goldstein, J. Spencer Martin, and Pierre Collin-Dufresne established an important puzzle in the world of debt securities. “My co-authors and I noticed that the empirical work on corporate bonds focused on their yields. This focus made sense because yields capture how much interest firms are paying to borrow money,” Goldstein says. “However, the findings of these empirical papers were a bit ‘boring’ because changes in corporate yields are mostly driven by changes in treasury yields. When we decided to focus on credit spreads rather than on yields—credit spreads equal the difference between corporate yields and treasury yields—we discovered they had a life of their own.”
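The spread calculation Goldstein describes is simple arithmetic; a minimal sketch, using invented yield numbers rather than data from the paper, illustrates the decomposition:

```python
# Credit spread = corporate bond yield minus the treasury yield.
# All yields below are hypothetical, for illustration only;
# they are not data from the Collin-Dufresne/Goldstein/Martin study.
corporate_yields = [0.062, 0.065, 0.071, 0.068]  # hypothetical corporate yields
treasury_yields = [0.045, 0.047, 0.052, 0.048]   # hypothetical treasury yields

credit_spreads = [c - t for c, t in zip(corporate_yields, treasury_yields)]
print([round(s, 3) for s in credit_spreads])
```

Subtracting out the treasury component is what lets the spread series move "on its own," separate from the interest-rate environment.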
Goldstein then examined whether changes in credit spreads could be explained by changes in those factors that theory predicts should explain them, such as changes in firm value. “We found that although these predicted factors did in fact help explain spread changes, a large fraction went unexplained,” he says. “The unexplained part across firms was highly correlated. There was a corporate-bond specific factor that was driving spreads in the aggregate economy that could not be explained by the standard model.”
This research emphasized that the standard frictionless, arbitrage-free models that academics usually focus on are a bit naïve. Goldstein says this became obvious during the Great Recession when the yields of corporate bonds rose sharply, well above the “fair cost” for taking on default-risk that could be determined from credit default swap (CDS) spreads. “Since the financial crisis, the fact that assets can move far away from their fundamental value during times of financial distress has become an important part of the literature,” he says. Since publication, Goldstein’s work has been cited 1,680 times by other researchers.
Pecking at the Pecking Order Theory
Professor Murray Frank is best known for two papers that examine what drives firms’ choice of capital structure—their mix of financing, especially debt and equity. This is one of the most important topics in corporate finance, and it has been the subject of fierce debate for many years. The main debate is which of two paradigms dominates firms’ choice of capital structure.
“In the late 1990s, the finance textbooks taught that capital structure was largely determined by firms balancing tax benefits of debt against bankruptcy costs—an idea called the tradeoff theory,” Frank says. “But when professors were asked what they believed, most of them seemed to believe an alternative theory called ‘the pecking order theory.’ According to this alternative idea, firms ignore taxes and bankruptcy costs, and instead follow a financing hierarchy—first retained earnings, then debt, and only in extreme cases use equity finance.”
Frank found the pecking order theory too simplistic to really be a good description. In “Testing the Pecking Order Theory of Capital Structure” (Journal of Financial Economics, 2003), he and Vidhan Goyal found that the evidence for this view was much weaker than previously thought and that most smaller firms actually issue far more net equity than net debt.
“We were able to show that the pecking order theory was not a good description of the financial choices of most firms,” Frank says. “We then did several follow-up studies to determine which factors seem to be empirically important. This had a significant impact on how the finance profession views the pecking order theory.”
One follow-up study, “Capital Structure Decisions: Which Factors are Reliably Important?” (Financial Management, 2009), took a comprehensive look at which firm characteristics have consistent effects on firms’ choice of capital structure, in particular their leverage (debt/value) ratio.
“It is now generally accepted that the hypothesized hierarchy is too simplistic, and things like taxes, bankruptcy costs, market conditions, and so forth matter more to firms when choosing their financing,” Frank says. “The old finance textbooks actually provided a better account of real firm decisions than most finance professors had realized. Many studies in recent years have built on the factors that Vidhan and I identified as being empirically important. More recent theoretical developments have also tended to build on the tradeoff theory as the foundation.”
A Bank by Any Other Name is Still a Firm
“The Theory of Bank Risk Taking and Competition Revisited” (Journal of Finance, 2005) by Professor and Banking Industry of Minnesota Chair John Boyd and Gianni De Nicolo ended up overturning conventional wisdom and has had a tremendous impact on financial policy. Until this paper was written, many banking researchers and regulators thought that increased competition among banks would lead to greater risk of bank failures, or “financial fragility.” Boyd showed that this need not be the case. Under some circumstances, greater competition makes banks safer because it gives their borrowers incentive to pursue safer strategies.
“Our research was built on an old idea, except that it had never been applied to banking before,” Boyd says. “A lot of people think banks are totally special, but they are not. They are a firm like any other firm trying to make money. Why should they be different from anyone else?”
Since this work was published, a great deal of applied research has looked to see which effect—greater fragility or greater safety—dominates in reality. The evidence is that sometimes, Boyd’s effect dominates, and sometimes it does not, leading people to consider other institutional or regulatory structures. Although this finding that “life is complicated” is not very sexy, Boyd’s research did a lot to dissuade policy makers from assuming that one size fits all.
“Our results turned conventional thinking around,” Boyd says. “What was widely believed in the literature was wrong. This research changed the way everyone looks at banking.”
Crafting a New Discipline
No mention of the Department of Information and Decision Sciences’ impact on research is complete without talking about how the Carlson School was present at the birth of a new field. Back in 1961, Remington Rand Univac gave the school a Univac 80, a solid-state, business-oriented computer that was powerful at the time but had a small memory and slow speed compared to today’s computers.
Emeritus Professor Gordon Davis, who joined the school that same year, was named the director of the Computing Center that housed the new machine. In the next several years, Professors Gary Dickson and Tom Hoffman joined the faculty and the three of them began to plan a formal program in the organizational use of computing—Management Information Systems.
“In the mid-1960s, every major business school had one or two faculty members interested in the use of computers for improved management of the organization, but they lacked textbooks, course materials, research support, and a community of like-minded scholars,” he says. “Minnesota pioneered the development of a new academic discipline for a new organization function to build and manage computer-based information systems. We were able to be an early innovator and leader because we had three innovative faculty, support from the dean, and support from the rest of the faculty in the business school.
“The three of us wanted solid support from information systems professionals in the Twin Cities. We believed they would play a key role in curriculum development and applied research. The link between faculty and the business world would be the Management Information Systems Research Center (MISRC).”
So, Davis, Hoffman, Dickson, and then Dean Paul Grambsch visited 30 companies in the Twin Cities metro area to present their vision. The businesses were enthusiastic about the idea, and in the summer of 1968, the MISRC came into being. Davis served as its first director.
The academic MIS program, part of the Carlson School’s Management Sciences Department, began in the fall of 1968 with 12 graduate-level courses. Ten master’s and eight PhD students enrolled in the first year. The program continued to grow and, with the reorganization of the department in 1988, merged with Decision Sciences faculty to form the Department of Information and Decision Sciences.
Nurturing the New Field
Throughout its history, the department has made huge contributions to pedagogy in the field, producing more than 45 textbooks, including Davis’ seminal Management Information Systems: Conceptual Foundations, Structure, and Development (1974), a textbook that truly deserves to be called a classic. Dickson developed and managed a summer program to retrain faculty in other disciplines to teach and research information systems. Many important scholars in the field were graduates of this program. Janice DeGross, a longtime staff member supporting MIS, is well known in the field for preparing and editing numerous publications, including articles, books, conference proceedings, and directories of MIS faculty.
The department has nurtured research in the field through its journal, the MIS Quarterly, regularly ranked as the most prestigious journal in the field. Dickson was the first editor-in-chief. As an example of impact, in 2003, it published “User Acceptance of Information Technology: Toward a Unified View.” Co-authored by Viswanath Venkatesh, Michael Morris, Fred Davis, and Gordon Davis, this work, although only 13 years old, has been cited more than 13,000 times by other researchers.
Minnesota was one of the leaders in forming a new community of scholars with conferences devoted to information systems in organizations. Dickson was the co-chair of the first Conference on Information Systems (now called the International Conference on Information Systems). Gordon Davis was part of the development and publication of model curricula for information systems. He also helped form a new international faculty organization, the Association for Information Systems, and served as its fourth president. Davis became the United States representative to Technical Committee 8 (Information Systems) of an international organization, the International Federation for Information Processing (IFIP). This helped Minnesota build an international network and positioned it as a leader at the international level.
The Minnesota Experiments
To examine the effects of various information system characteristics on decision making, what became known as the Minnesota Experiments were conducted from 1970 through 1975. A number of professors, including Gary Dickson, Norman Chervany, and Roger Schroeder, were active in this research. Initially, the research was funded by the Office of Naval Research.
“Using this funding, Dickson and I built a computer-based manufacturing experimental environment—The Production Simulator,” says Emeritus Professor Chervany. “This environment gave us the ability to manipulate the information that people used to make a series of forecasting, production, and inventory decisions.”
The first of these experiments studied the effects of information overload by comparing the decision-making results of people who had raw, non-summarized data versus people who had statistically summarized data. The study was published in Management Science in 1974 (“An Experimental Evaluation of Information Overload in a Production Environment”) and a paper summarizing this stream of research was published in the same journal in 1977 (“Research in Management Information Systems: The Minnesota Experiments”).
“This stream of research was some of the first experimental research that examined the interaction between the characteristics of actual decision makers and variations in the information to which they had access,” Chervany says.
Regional Energy Information System
Chervany also was involved in the Regional Energy Information System (REIS) project, funded by the Upper Great Lakes Regional Commission as a state-oriented response to the 1973 Mideast oil embargo. He and Associate Professor Dave Naumann worked with the newly formed Minnesota Energy Agency on the project. Their research focused on the supply, distribution, and consumption of all forms of energy in the state, particularly in the Upper Great Lakes region. REIS processed energy data and directed energy conservation efforts based on that data, proving to be of significant value in managing Minnesota’s 1977 energy crisis.
Pioneering Beyond the MIS Field
Minnesota’s MIS faculty have also had a significant impact on how the doctoral dissertation is defined and carried out, in ways that have proved helpful across many university disciplines. Professors Sal March and Gerald Smith defined and explained an approach to research termed Design Science that has been widely adopted in many fields. Gordon Davis wrote a short monograph on the process of researching and writing the doctoral dissertation that has been used by more than 70,000 doctoral students worldwide and is now in its third edition.
Is Negative Brand Information Always Bad?
With the prevalence of online rating systems for goods and services, negative information about brands is everywhere. But how do consumers process such negativity about products they have grown to know and love? Rohini Ahluwalia, the Curtis L. Carlson Trust Professor of Marketing, set out to determine this back in 2000. Her work, “Consumer Response to Negative Publicity: The Moderating Role of Commitment” (Journal of Marketing Research, 2000), was the first systematic empirical investigation of how consumers deal with the negative information they encounter.
“The predominant thinking in the field was that negative information is always bad,” she says. “But there were only case studies in the literature that supported this notion. There wasn’t any empirically tested research.”
Ahluwalia and co-researchers Robert Burnkrant and H. Rao Unnava decided to put this conventional wisdom to the test and conducted three studies to examine the impact of negativity on brands. The results held a few surprises. Consumers who were committed to a specific brand instinctively argued away negative information they heard about it. Such an effect was found for low-commitment consumers as well, but to a lesser degree.
Most important for companies was that Ahluwalia’s work gave them strategies for dealing with negative information. Businesses were in the habit of using a mass approach when countering negative publicity about their brands; Ahluwalia showed that a targeted approach is more effective. Committed consumers respond more to a “diagnosticity strategy,” which reduces the information value of the negatives (e.g., by suggesting that the brand is not alone in erring), while businesses addressing a segment of consumers not committed to the brand will find more success in offering counterarguments against the negative information.
This research, now cited 816 times in the literature, laid the foundation by presenting a theoretical framework for systematically studying negative publicity and has informed a slew of marketing ventures, from brand positioning to political campaigns.
Bringing Meta-Analysis to Marketing Research
A highly impactful paper that brought new research methods to marketing was “The Effect of Price, Brand Name, and Store Name on Buyers’ Perceptions of Product Quality: An Integrative Review” (Journal of Marketing Research, 1989) by Professor and General Mills Chair in Marketing Akshay Rao and Kent Monroe.
This paper examined two issues that are arguably central to any business: what quality to deliver and what price to charge for that quality. More specifically, the paper integrates extant research to determine the extent to which customers infer quality from price and associated information that may have little to do with the actual performance of the product.
Rao and Monroe found a significant relationship between a consumer’s perceived quality of a product and its price, as well as with the brand name. However, the effect of a store name on perceived quality turned out to be small and not significant.
Rao says there are three reasons this paper has been so impactful and garnered more than 1,500 Google Scholar citations. “First, it speaks to a fundamental issue in marketing: how do consumers make quality inferences and how can a firm leverage those processes,” he says. “Second, it takes the existing wisdom on the topic—more than 50 empirical studies at the time—and integrates them using a mathematical technique called ‘meta-analysis,’ a technique that was new to the field.”
The third reason is that the research offers interesting and provocative theoretical insights that spurred further research and practical prescriptions that identified conditions under which consumers’ tendency to use price to infer quality might be leveraged by companies.
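The integration technique Rao mentions can be sketched with the standard inverse-variance (fixed-effect) form of meta-analysis, in which each study’s effect estimate is weighted by the inverse of its sampling variance so that more precise studies count for more. The numbers below are invented for illustration, not drawn from the 50-plus studies the paper analyzed:

```python
import math

# Hypothetical effect sizes and sampling variances from three imagined studies
# of the price-quality relationship; not data from Rao and Monroe's paper.
effects = [0.30, 0.45, 0.20]       # hypothetical study-level effect estimates
variances = [0.010, 0.020, 0.005]  # hypothetical sampling variances

# Inverse-variance weights: precise studies (small variance) get large weights.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate

print(round(pooled, 3), round(se, 3))
```

Pooling across studies this way is what let Rao and Monroe draw a single, statistically grounded conclusion from dozens of separate experiments.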
Watching Where You Stick Your Brand Name
After having done some research on brand extensions—how consumers’ beliefs about a brand transfer to a new iteration of that brand—Barbara Loken, the David C. McFarland Professor of Marketing, found a new direction to take her studies: backward. After a department seminar conversation with Deborah John, the Curtis L. Carlson Chair in Marketing, the two decided to research how brand associations might work in reverse. Would a failed brand extension transfer harmful associations back to the parent brand name?
“We knew that this area was wide open and that no research had been performed on the topic,” Loken says. “We had also heard a branding consultant refer to this harmful transfer as brand dilution, but no one had officially coined the term yet and performed empirical work on it.”
Loken and John talked to marketing practitioners who believed their strong brand names could not be harmed by a brand extension failure or an inconsistent brand extension. But marketers in other companies said they had concerns about potentially harming their brand with a new product. “It was clear that marketers in different companies disagreed on whether brands could be harmed through extensions,” she says. “Our research showed that strong brands can be harmed. Johnson & Johnson [J&J] is the brand we studied, and we found it was vulnerable to dilution.”
However, they found that brand dilution does not always occur. “If a brand extension is moderately inconsistent with the parent brand beliefs—gentleness in the case of J&J—then the J&J parent brand is rated as lower on the gentleness belief,” she says. “If a brand extension is extremely inconsistent with the brand on more than one dimension, such as both low on gentleness and low on quality, consumers will discount the extension as extremely atypical and will not lower their brand perceptions.”
Loken and John’s findings, “Diluting Brand Beliefs: When Do Brand Extensions Have a Negative Impact?” (The Journal of Marketing, 1993), impacted the field as it was the first study to show empirically that brands could be harmed through extensions. “It altered the landscape by making practitioners more cautious about extending their brand names without care or forethought,” Loken says.
This research opened up new paths for others in marketing—both consumer behavior researchers and modelers. “Quite a lot of research has now accumulated on the topic of brand dilution, and the general topic has been extended beyond dilution due to brand extensions to dilution due to negative publicity and product recalls,” Loken says. “The research also has been used in trademark dilution research as a template for looking at the measurement of consumer confusion due to similarities of trademarks.”
Mapping Out the Innovation Journey
“One of the things that put the Carlson School on the map is the Minnesota Innovation Program that went on in the 1980s,” says Professor Andrew Van de Ven.
In essence, the program was a collaboration among 30 investigators from eight different departments across the University, including the Carlson School, Agricultural and Applied Economics, the School of Public Health, Education, Speech-Communication, and the Humphrey Institute. From 1983 to 1992, these investigators studied 14 different innovations from conception to implementation. The Carlson School professors involved in the project included Van de Ven, Harold Angle, Ian Maitland, Chuck Manz, Alfred Marcus, John Mauriel, Peter Ring, Roger Schroeder, and Gary Scudder, as well as about 15 PhD candidates.
And the innovations were as varied as the investigators studying them. Cochlear implants and the commercialization of low gravity in outer space were among the topics under examination.
One of the main results of the research was “a mapping of the innovation journey,” Van de Ven says. “From conception to implementation, 12 hurdles or obstacles are commonly encountered in a wide variety of innovations. When reviewing these obstacles, one of the big ‘ahas’ for managers is that they cannot control the process, but they can learn to maneuver the journey. It is like traversing some uncharted river. You can increase your odds if you know how to swim, but that doesn’t assure you will survive getting down the river.”
The results were so impactful that they led to about 100 papers on the topic as well as more than a dozen books. Even the Carlson School’s curriculum was impacted. For several years, Van de Ven and Senior Lecturer Steve Spruth have been teaching “Managing Innovation and Change” to both undergraduates and MBA students. “Here is research that has not only made a difference in advancing science but also in teaching, and Carlson School students have been the first to learn about it,” Van de Ven says.
The idea for the innovation project came about when Van de Ven was talking to then Dean David Lilly, who was formerly president of The Toro Company. “He asked if I wanted to meet with Twin Cities executives,” Van de Ven says. “He sent a letter to his fellow CEOs and I met with 30 of them within a few weeks…I asked them all a question, ‘What keeps you up at night?’” The common response was innovation and how it could best be managed. Van de Ven suggested a collaboration—the companies would share some innovations they were starting up and he and fellow researchers would track them to see what could be learned and leveraged.
He contacted others throughout the University who were similarly interested and soon they were together exchanging findings and ideas in regular meetings, seminars, and discussions. “There have been thousands of studies on innovation, but very few have focused on the sequence of events that unfold over time. We were one of the first to do real-time tracking where we would be sitting in on meetings at 3M, Cargill, and elsewhere tracking their progress from concept to implementation,” he says.
Van de Ven asks, “Why is this important? Because most companies have many innovative new projects and projects they are trying to get off the ground. Our research findings suggest that companies can increase their odds of innovation success by about 20 percent. That is huge. It can mean the commercial viability of a company.”
Recognizing Good Knowledge
For years, Professor Shaker Zahra has studied how large corporations respond to technological and competitive changes in their industries and how many of them failed to retain their entrepreneurial spirit over time. “I found that, surprisingly, given the resources and skills these companies had, they were also heavily engaged in R&D, alliances, and joint ventures,” he says.
Zahra, the chair of the Department of Strategic Management and Entrepreneurship and the Robert E. Buuck Chair of Entrepreneurship and Professor of Strategy, found that these companies were exposed to vast amounts of knowledge about market and competitive trends and were well aware of what their competitors were doing. However, even though many of these companies were at the center of their market, they somehow failed to fully comprehend the information and knowledge they were receiving.
“This led me to look into the absorptive capacity of these companies—the capability by which they are able to identify relevant knowledge as well as acquire, assimilate, and use it,” he says. “The concept has been around in the literature. But what I had done was to reframe it in ways that made it useful to managers by separating potential from realized capacities. Potential capability centers on knowledge accumulation. Realized capacity focuses on knowledge exploitation.”
This distinction between potential and realized capacity made it possible to see that companies can amass tremendous amounts of knowledge but still fail to use it effectively. On the other hand, some companies that are not knowledge rich do a masterful job in exploiting their knowledge in fueling innovation and entrepreneurship. “What makes the difference is what managers do to integrate and effectively deploy that knowledge,” Zahra says. “Managerial insight is important.”
Zahra’s work, “Absorptive Capacity: A Review, Reconceptualization, and Extension” (Academy of Management Review, 2002), written with Gerard George, has been widely cited—6,336 times—by scholars in entrepreneurship, strategy, international business, information systems, public administration, and political science, among other fields.
“Researchers have found it useful to study the link between organizational knowledge and intelligence and innovation, entrepreneurship, and successful performance,” Zahra says.
Fostering Online Platforms to Reduce, Reuse, and Recycle
Over the last decade, state- and county-run online waste exchanges have kept reusable goods in circulation and prevented several tons of waste from crowding landfills. The Internet seems like an ideal venue to connect companies looking to get rid of surplus materials—lumber, cement, and the like—with buyers in need of the items.
“Seventy percent of everything in landfills is low-value residual items like these, so to the extent that we can get rid of these low-value items—we can repurpose them and recycle them—it will mitigate negative effects on the environment,” says Professor Kevin Linderman.
But the exchanges tend to function better in theory than in practice. And that prompted Linderman and Associate Professor Karen Donohue, along with their PhD student Suvrat Dhanorkar, to explore the factors that cause them to fail—and how they could be improved.
According to Donohue, visibility emerged as a big factor. “The listings typically don’t have pictures, so buyers can be uncertain over what they’re getting,” she says. “And even if there are detailed descriptions, buyers still might not be clear about the quality of the items.”
As Linderman explains, that leads to a lack of buyer interest—and seller disengagement. “If there’s no interest, a seller might think: Why not just dump the items in a landfill?” he says. “You can see how that would be easier than holding onto excess inventory.”
Their research also found this result: In counties with a wide range of reuse options, sellers had better luck on the exchanges. According to Donohue, that was one of the most surprising findings. “You might see them as competition for the exchanges,” she says. “But it appears that those competitors can create more of a market—and even a culture of wanting to reuse and recycle.”
And that can have policy implications for municipalities looking to kick-start their exchanges. “If you can create a community in which buyers are used to buying recycled items, it can help sellers ride out their listing a little longer on an exchange,” she says.
The Financial Consequences of Medical Device Recalls—Not Much
Medical devices are an indispensable and often lifesaving component in health care delivery. However, they also can be sources of significant risk to patients, because like other products, they are prone to quality failures such as manufacturing defects, functional defects, packaging errors, and software glitches. Statistics show that companies have been recalling medical devices from the market with increasing frequency.
Supply Chain and Operations Department Chair and Mosaic Company-Jim Prokopanko Professor of Corporate Responsibility K.K. Sinha and Sriram Thirumalai, a PhD alum and an associate professor at Texas Christian University, sought to assess the financial implications of medical device recalls to understand if these consequences are severe enough to deter companies from introducing potentially hazardous medical devices into the market. They also looked at a cross-section of medical device companies to examine the effects of firm characteristics on the costs of poor quality and the characteristics likely associated with device recalls.
“Contrary to conventional wisdom, the findings of the study demonstrate that the capital market penalties for medical device recalls are not significant—the costs of poor quality are not severe,” Sinha says. Troubling as this may be to consumers, the reason the capital market costs of poor quality do not deter companies from introducing potentially hazardous medical devices is the lack of a strong market reaction—markets learn to expect recalls.
Sinha says their study, “Product Recalls in the Medical Device Industry: An Empirical Exploration of the Sources and Financial Consequences” (Management Science, 2011), has motivated several follow-up lines of inquiry. First, the study focused on market reactions in the short term; new studies are investigating the long-term impact of recalls on shareholder wealth.
Second, other studies are exploring the mechanisms by which companies learn. “Given the high frequency of recalls across firms in the medical device industry, one follow-up study is exploring the role of learning spillover from recall experience across product categories and exploring the preventive and controls aspects of learning.”
Third, the study has motivated inquiries into why capital markets fail to react strongly to medical device recalls—especially since they have been known to react strongly and negatively to automotive and food recalls.
“Given that recalls are an inevitable reality of technological progress, and that medical device recalls continue to occur in spite of the rigorous product quality checks conducted by firms and government regulators such as the FDA, we need to complement the reactive approach to handling product recalls with a proactive and predictive approach so that the product manufacturers and regulatory agencies can either prevent the occurrence of recalls or minimize the disruptive effects of recalls,” Sinha says.
Years ago, when the Department of Work and Organizations was called the Industrial Relations Center, much of its research involved jointly appointed faculty from other departments. One of its most significant areas of research was the Work Adjustment Project, also known as the Minnesota Studies in Vocational Rehabilitation. This project was conducted primarily by Bill England from industrial relations, Lloyd Lofquist from the department of psychology, psychology PhD candidate Dave Weiss, and Rene Dawis, who had a foot in both industrial relations and psychology.
“This project ran from the middle 1960s through the early 1970s,” says Professor Emeritus John Fossum. “Its primary development was a psychological model matching people and jobs and all of the measurement instruments necessary to enhance the outcome.” The most notable of these measuring instruments was the Minnesota Satisfaction Questionnaire (MSQ). First published in 1967, the MSQ was designed to measure employee satisfaction with several different aspects of the work environment, such as company policies, compensation, responsibility, and achievement.
Carlson School Professor Richard Arvey made use of the MSQ in an important study done with twins in the late 1980s. In “Job Satisfaction: Environmental and Genetic Components” (Journal of Applied Psychology, 1989), Arvey queried 34 twin pairs to determine if there is a significant genetic component to job satisfaction. His findings showed job satisfaction was 30 percent genetic and 70 percent due to environmental and other factors. He also noted that twins tend to seek out similar jobs, a finding made all the more interesting by the fact that the twins in his study had been raised apart.
Bringing an employer perspective to the field, Carlson School Professors Herb Heneman, Jr. and Dale Yoder were heavily involved in raising professional standards in human resource management. “They co-edited an eight-volume collection of research-based practice chapters across a wide range of professional competencies human resource executives needed to command,” Fossum says. Heneman and Yoder were also heavily active with the American Society of Personnel Administrators, now known as the Society for Human Resource Management (SHRM)—so much so that the SHRM annually bestows the Herbert Heneman Jr. Award for Career Advancement.
Focusing on the Unemployed
For the last 10 to 15 years, Professor and Industrial Relations Faculty Excellence Chair Connie Wanberg has been researching how to proactively cope with unemployment and improve the success of job search behavior. Her most influential work has been “Psychological and Physical Well-Being During Unemployment: A Meta-Analytic Study” (Journal of Applied Psychology, 2005). This paper has received recognition for being in the top 1 percent in the academic field of psychiatry/psychology as well as for having 1,074 citations.
This study was important because Wanberg and coauthors Zhaoli Song, Angelo Kinicki, and Frances McKee-Ryan took all available findings across all journals and across all available countries to show in a definitive manner that unemployment has a negative effect on individual mental health. Additionally, this negative effect was shown to be due to the characteristics of the experience and not the individual. The authors also examined characteristics of the person and of the experience that explain whether an individual will fare better or worse during unemployment.
“This article has been impactful because it has been useful to both researchers and practitioners. It addresses an important life event and societal issue—unemployment is a life event experienced by many individuals,” Wanberg says. “There is a strong need by agencies to understand how to best help individuals who experience job loss and how to get them back to work quickly.” Some agencies view this from a financial angle. Unemployment insurance costs taxpayers millions of dollars, and the health implications alone are expensive to society.
“Other agencies are concerned about this from the individual angle—how to provide services that will help unemployed individuals find good jobs and how to alleviate the stress involved with being unemployed and without work,” she says.
Wanberg has regularly heard from practitioners that this article has been helpful to them. “An agency called JVS Work Transforms Lives in San Francisco recently received a $6.4 million grant from the Department of Labor to create programs to help the long-term unemployed,” she says. “Evidence that documents how unemployment affects individuals and who is most likely to need help is important to create cases for these programs. Jamie Austin, the vice president of finance and operations of this agency, wrote to me and noted that they drew upon my work in their proposal.”