A common problem in prediction markets is:
What if you want to ask a question with no specific closing date in the future?
In other words, you can enter an arbitrary cutoff time for the question, like "the end of the year." But the report or announcement your question is based on could come out at any time with the answer.
And as soon as people hear about it, before the market administrator can cash out the market, a bunch of traders can make free fake money buying up a guaranteed sure thing. Those traders end up looking smarter than they actually were according to their balances and winnings, and the question no longer reflects the true probability predicted by your crowd.
Many great prediction market questions fall into this category.
So the team at Inkling came up with a solution. You can now cash out a question as if it were cashed out in the past, refunding any trades people have made since then and setting the stock price back to where it was before the answer was publicized and made known to your astute traders.
Here's a concrete example where we had to use this ourselves recently.
A while ago, I created the question: "Which of these companies will declare bankruptcy by 2010?" http://home.inklingmarkets.com/markets/17817 But of course, bankruptcy news just gets announced whenever. I couldn't set much of an expiration date on this except for Dec 31, 2009, and I don't watch the news all day long to cash the question out as soon as there's an announcement.
Then yesterday evening I learned that Station Casinos had declared bankruptcy. The news first came out at about 5PM Central time, so in the span of a few hours after that, traders had bought Station Casinos up to about $99. It's free fake money, after all, and they know I have to cash it out at $100.
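The post doesn't describe Inkling's actual pricing engine, but many play-money prediction markets use an automated market maker such as Hanson's logarithmic market scoring rule (LMSR), which is what makes "buying up a sure thing" push the price toward $100. Here's a minimal sketch of that dynamic; the function names and the liquidity parameter `b` are illustrative assumptions, not Inkling's implementation:

```python
import math

def lmsr_cost(q, b=100.0):
    """Hanson's LMSR cost function: C(q) = b * ln(sum_j exp(q_j / b))."""
    return b * math.log(sum(math.exp(qj / b) for qj in q))

def lmsr_price(q, i, b=100.0):
    """Instantaneous price of outcome i, i.e. its implied probability."""
    total = sum(math.exp(qj / b) for qj in q)
    return math.exp(q[i] / b) / total

def buy(q, i, shares, b=100.0):
    """Return (cost, new_quantities) for buying `shares` of outcome i."""
    new_q = list(q)
    new_q[i] += shares
    return lmsr_cost(new_q, b) - lmsr_cost(q, b), new_q

# A fresh yes/no market opens at a 50/50 price.
q = [0.0, 0.0]
print(lmsr_price(q, 0))             # 0.5

# Traders who know the answer pile in, and the price races toward 1.0
# (i.e. $100 on Inkling's scale), much like Station Casinos hitting $99.
cost, q = buy(q, 0, 500)
print(round(lmsr_price(q, 0), 2))   # 0.99
```

The point of the sketch is that once the answer leaks, nothing in the market maker itself stops the price from being bid up to near-certainty, which is exactly the problem the cash-out-in-the-past feature addresses.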
But now with the ability to cash things out in the past, I can correct this.
I just go to cash out the stock as before, but now there is a link to click to cash out the question at some point in the past.
I pick the true date and time in the past, just before the answer to the question was publicized to everyone. I set this to 5PM Central yesterday, as that was the earliest news announcement I found.
And I continue cashing out the question's answer just as before.
So now, anyone who made a trade after 5PM yesterday has had their trade refunded, and the predicted price of the stock is reset back to its value at that time.
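The bookkeeping behind this is straightforward: walk the trade log, refund everything at or after the cutoff, and roll the price back to the last pre-cutoff trade. A minimal sketch, assuming a simple trade log (the `Trade` fields and function name are my invention, not Inkling's actual schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Trade:
    trader: str
    cost: float         # fake money spent on the trade (negative for a sale)
    price_after: float  # stock price once this trade executed
    at: datetime

def cash_out_in_the_past(trades, balances, cutoff):
    """Refund every trade made at or after `cutoff`, and report the
    stock price as of the last trade before the cutoff."""
    kept, price = [], None
    for t in sorted(trades, key=lambda t: t.at):
        if t.at >= cutoff:
            # Undo the trade: give the trader their money back.
            balances[t.trader] = balances.get(t.trader, 0.0) + t.cost
        else:
            kept.append(t)
            price = t.price_after
    return kept, price

trades = [
    Trade("alice", 25.0, 42.0, datetime(2009, 7, 27, 9, 0)),
    Trade("bob",   55.0, 99.0, datetime(2009, 7, 27, 18, 30)),  # after the news
]
balances = {"alice": 975.0, "bob": 945.0}
kept, price = cash_out_in_the_past(trades, balances, datetime(2009, 7, 27, 17, 0))
# bob's post-news trade is refunded and the price rolls back to 42.0
```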
We hope this is useful and clears up any confusion on how best to ask questions like this of a prediction market.
Crowdsourced forecasting using prediction markets. We've lived to tell the tale.
Wednesday, July 29, 2009
Tuesday, July 28, 2009
"Should we be concerned about people 'wasting' too much time using Inkling?"
A post in the New York Times' Freakonomics blog really struck a chord with me today. It was actually commentary on a recent Paul Graham post about meetings in companies, so I wanted to share some thoughts of my own.
Because Inkling seems a bit like a game, I am often asked whether companies have had problems with employees "wasting too much time" using Inkling, and whether there's a way to set "trading hours" in the application to limit this. (No there isn't, and I doubt there ever will be.)
I know why this question comes up - because Inkling is something new and unconventional - but I always privately chuckle at it, because I think back to my days working in a large company and remember how many meetings I sat in. Countless multi-hour meetings that, looking back, were more often than not a complete waste of time. So which is worse: spending 30 minutes per week in Inkling, where you're efficiently expressing what you think about a wide range of issues, or 2 to 4 hours per day or more in meetings?
Even though I am out of the large company environment, since I am regularly talking to people about Inkling, I still spend a fair amount of time in meetings. As Graham points out, this is pretty disruptive to my productivity and I find I often don't get much "real work" done until "business hours" are over. Talking to friends, they live the same way: meetings all day, come home, eat dinner, then back to work to make up for all the time they were in meetings. Yuck.
I don't think many would disagree that Inkling is a more efficient way of collecting information (especially from a large number of people) than a meeting, but could that actually translate to fewer or shorter meetings?
Take how risk management is handled on projects across any number of industries. Most sizable projects maintain some sort of "issues log" that identifies the project risks and the mitigating actions being taken. Many a meeting is held to discuss which risks are likely, which aren't, and how they should be dealt with.
In Inkling, you're already expressing your opinion on those questions without the need for lots and lots of meetings. And you can update your opinion at any time. In fact, ask about all your major project risks in Inkling and you now have a prioritized list of risks according to the likelihood they will occur:
| Question | Chance |
|---|---|
| Will we meet our deadline to complete QA testing? | 18% |
| Will the supplier be able to fulfill the complete order on time? | 76% |
| Will the cost/benefit analysis be below the threshold of moving forward with development? | 92% |
Looks like QA testing is in the most trouble. Might be time for a meeting! But at least there's a higher likelihood that meeting can be focused. That could translate to a shorter meeting, which most importantly translates to more time to actually work on improving the chances that testing gets done on time, or at minimum a resetting of expectations.
So can prediction markets reduce the number of meetings? Given today's corporate culture, I would never be so bold as to make that statement (be sure to read Paul's full post for some ideas to try, regardless). But can they increase the quality of meetings by providing valuable, actionable input to what is being discussed? Most definitely.
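The prioritized risk list above amounts to nothing more than sorting questions by their traded probability. A trivial sketch (the questions are copied from the example; the data structure is illustrative):

```python
# Each entry: (market question, traded probability of a "yes" answer)
risks = [
    ("Will we meet our deadline to complete QA testing?", 0.18),
    ("Will the supplier be able to fulfill the complete order on time?", 0.76),
    ("Will the cost/benefit analysis be below the threshold of moving forward with development?", 0.92),
]

# A low chance of "yes" on these questions means high risk, so the
# lowest-probability question is the first thing to put on the agenda.
for question, chance in sorted(risks, key=lambda r: r[1]):
    print(f"{chance:4.0%}  {question}")
```

Running this puts the QA testing question at the top of the agenda, which is exactly the focusing effect described above.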
Saturday, July 18, 2009
When Bad Things Happen to Good Projects
I recently came across this two-year-old article on CIO.com about a project at Hewlett-Packard to replace old legacy supply chain systems with SAP. Migrations had gone fairly smoothly until they got to a much larger department that handled customer orders. They knew this one was going to be a bigger challenge. Before the start of the migration, the project managers did their requisite contingency planning and padded their workplans to accommodate a few glitches they foresaw with one particular legacy system. Long story short, they severely underestimated those issues and were hit by others they hadn't planned for, business operations were negatively impacted, and HP pegged revenue losses at around $40M.
Another anecdote in the article was about Nike:
"Other companies besides HP have faced similar business disasters from relatively small IT errors. Nike, for example, had a problem with a demand-planning application when it switched to a centralized SAP system in 2001. The problem was tamed within a few weeks. But because the company did not have an adequate business contingency plan, the small glitch in IT cost Nike $100 million in revenue."
The article goes on to talk about the need for better contingency planning and a more thoughtful approach that considers the impacts IT snafus will have on operations. If you're a project manager, it's all good advice. But having managed several large projects myself in my consulting days, I learned that contingency planning only gets you so far. Having a great project manager who is a Gantt chart ninja only gets you so far. Having managers from the "business side" involved only gets you so far. Having good team chemistry and lots of fancy collaboration tools only gets you so far. Having a robust methodology only gets you so far. And even having all A-players on your team only gets you so far.
So why do so many promising IT projects go bad?
Let's dissect at a very high level how a large IT project is run. The standard operating procedure is to do a lot of planning up front (including formulating contingency plans), create thorough workplans that are managed throughout the project, create and manage issue-tracking spreadsheets, set up a "program management office" with several project managers working in concert if the project is large enough, set up a team collaboration space for files and dialogue, create protocols for reporting status up and down the project hierarchy, schedule weekly status meetings, and facilitate ongoing communication between the different parts of your project team and the business.
That all sounds great, but obviously there's something wrong or else we wouldn't read articles like this.
One key element missing in so many of these large IT implementation projects is ongoing, unfiltered feedback from the project team itself AND from those whose business is impacted by the IT changes. On our projects at Accenture we certainly were never accused of not having enough meetings and documentation (what do you think your millions are paying for?) We used to collect status reports from every person on the team and have status meetings at least once a week. We had liaisons from the business units we were building the systems for and met with them regularly to keep them updated on the project and to hear their concerns. Sometimes they even joined our teams full-time. We had an extensive knowledge base of sample deliverables and project post-mortems. We had collaboration tools and executives coming around to QA the project every few weeks. We had Subject Matter Experts from the software vendors. Any of this sound familiar?
I would argue that you don't have to look much farther for one of the problems with IT implementations today than the ubiquitous "status meeting." You get together in a room with all the "leads" on the project, usually at the end of the week, and talk about how each of your teams is doing. Sometimes you even have the entire project team there, which can be 100 people or more. (It goes without saying that no one really wants to be there.) You highlight things that have been accomplished, what you're going to work on next week, and the critical issues you're dealing with. A good project manager will ask a lot of questions during these meetings to try and coax more information out of people, and every once in a while some squirming and squabbling ensue. After the status meeting the project manager may send out notes and an updated project plan reflecting people's comments and how he or she has interpreted their impacts.
But is the project manager really getting the information they need to make informed decisions about the direction of the project? There's a game of self-preservation and chest thumping that often occurs in these meetings that is a filter for everyone's comments. Who wants to look like an ass in front of the people they get evaluated against for promotions and raises? Are you really going to let on that you haven't been able to work through some problem that you think will sound embarrassingly easy to others, or that you think another manager is full of it?
It turns out that we humans are all just a little egotistical, sensitive, jealous, and competitive. Those characteristics are oil in your glass of project management water.
Meanwhile, despite the project manager's best intentions and traditional information collection techniques, signals about milestones that are going to be missed or issues that are going to blow up don't reveal themselves, and the project rolls on like a tank in a minefield. A project manager has to respect their "lines of communication" on the project, or the larger organizational hierarchy of the company. They only hear about the big-ticket issues, the issues their direct reports are even aware of, or worse, just the excuses that make the issues sound minimal.
Now you have a project manager, no matter how talented they are, operating with large blind spots because they aren't getting proper exposure to reality. Next thing you know you're getting written up in CIO.com as a case study and your CEO is talking about $40M losses in revenue. Ouch.
What if that HP project had been running a prediction market where their entire implementation team AND people (not just liaisons) from the business units their project was affecting could trade on their project milestones? Or what if the project had taken their list of project issues and put those in a prediction market to understand the probability of them occurring? Or even better allowed people from the business side to ask their own questions about the health of the project and impacts on their operations: issues a project manager could never expect to always be on top of? I suspect HP's talented project managers would have done a lot of things differently if they had access to this new information.
There is no getting around the fact that large-scale IT system implementations are always going to be messy and complicated. Prediction markets are not only an efficient mechanism for providing actionable feedback to the project manager to hopefully avoid many of the issues and lost revenue they encountered, but serve as a credible counter-balance to the fact that we're all...human.