Beta burnout

4178265189_4785cf3fa5.jpg
Another blogging platform; another next Twitter or Facebook or YouTube; another must-have smartphone app; another groundbreaking piece of hardware that will revolutionize ... After a while, it's just so much beta burnout.

If you are trying to lead the way to whatever is next for journalism (which I suspect is true of many of the readers of the Carnival of Journalism), then you have been there and done that.

It's the kind of thing we bitch about over beers, but our Carnival host this month, Bryan Murley, has made a call to pull back the tent flaps and see the clowns without makeup:

How do you decide to dedicate time to a new tool/platform/gadget? What is the process you go through mentally? And then later - how do you convince others to go through that process? And, last: How do you ensure that the tools you do adopt are used once the "newness" factor fades?

In short, he's asking us to admit that trying new tools, gizmos and websites is one huge time suck.

There, I've said it out loud. Big time suck.

Investigating, learning and adapting to new stuff is like throwing waking hours to the winds. Even for the things that work out.

Some of the things you try just won't work. Some provide a solution in search of a problem. Some won't provide enough incremental benefit to warrant the pain of adoption. Some otherwise excellent products will never gain traction in the marketplace, and their makers will move on (that is, fold, change direction, or orphan the product).

But experiment you must, lest you end up still using a 14,400 bps modem and Windows 98 for the rest of your, indeed, wretched life. You have to resist "but we've always done it that way." There's no moving forward without adaptation, evolution and adoption.

To minimize beta burnout, be ruthless in choosing what to try, and give those choices your best shot.

Factors to consider for a media organization (what you do for fun is, well, what you do for fun):

  1. Does the tool/gadget/service offer the potential to significantly improve a work flow or task?
  2. Does the product/gadget/service allow you to tap what could be a significant new audience?
  3. Is it a nascent technology that is expected to be a factor in the future?
  4. Is there enough buzz to give it a chance of gaining traction?
  5. Does it provide a tangible competitive advantage?

Learn to say "no" to the rest.

The toughest thing may be having the discipline to give the new whatever-it-is a fair test. Find cases for its use. Figure out its limitations and bugs. Work out how it can be integrated into the work flow. Logging in once for a look is not a test.

There's no harm in calling the implementation of a new tool or product a test or a beta effort; Google has kept widely used products in beta for years.

Be willing to say the test was a failure and move on. Just because you've invested time and energy in a tool doesn't mean it's a good one for others to invest in, or for you to continue investing in. (Even if it works, a better solution may come along.) Constantly evaluate.

Remember: Adapt, evolve, adopt.

Getting others to take to a new tool is not easy. It's easiest, perhaps, if you can celebrate some wins. Training is another key: many people in your organization will not venture to figure out something new on their own. Once they've used the new tool, make sure to celebrate their successes. Getting everyone on board up and down the org chart may take more than positive buzz.

The last component of our Carnival question is how to keep playing with something after Christmas Day and not let it get lost in the back of the closet. Maybe that one answers itself. If it's in the back of the closet and you don't notice it, maybe you didn't need it. It may be time to pack it off.

Did it really not accomplish the goal? Did something better come along? Was it a training or management issue?

If it is really something that the organization needs to accomplish a goal, then you have to rely on leadership to make it happen. My editor often observes: "What gets measured gets done."

Give that a test; it works.

(Photo by Moyan Brenn)
