2 items tagged "ROI"

  • How To Use Competitive Intelligence To Drive Email ROI

    Marketers who use competitive intelligence tools enjoy an average of three times more email-generated revenue than those who don’t, according to a recent report by The Relevancy Group.

    Yet one of the most common questions I'm asked when I present a client with a competitive analysis is: "There's no point in doing this more than once a year, right?"
    Think again. There’s a lot you can -- and should -- do with competitive intelligence tools to drive ROI on a regular basis. Here's a short list to get you started:
    1. Learn from your competitors' tests, not just your own. We all talk about testing, but did you realize you can effectively double your testing insights by gleaning ideas from competitors? If you see what works for them, you can test it for yourself. And if you see something that doesn’t work, you can deprioritize that test and put more lucrative efforts first.
    2. Identify key subject lines, phrases, creative, etc. If it engages your competitors’ audience, it will probably engage yours, too. It’s worth sorting through creative examples to get ideas for what you can test next.
    3. Quickly see what is new in marketing. It can be difficult to find the newest innovations, tools, or techniques that can drive your results and make your job easier. A competitive analysis tool can help you keep tabs on your competitors so you can identify when they are doing something that you can’t. Think about all the technologies we now use that were virtually unknown 10 years ago: real-time suggestion engines, dynamic image generation, and more. Just by asking "How did they do that?", you might uncover that your competitors are using a new tool or technique that you could implement to help drive your ROI too.
    4. Prove you need a bigger budget. A competitive tool can help you see exactly how much effort your competitors are putting into their email channel. Based on those competitive insights, you can prove that you need a bigger budget to keep up.
    5. Track benchmarks. It’s helpful to understand how you stack up against competitive benchmarks, such as read rates or share of voice (see the sketch after this list). It can be even more helpful to know how those metrics change across seasons and holidays. This can support your budget requests or even help you restructure your program.
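
    As an illustration of the share-of-voice benchmark in item 5, here is a minimal sketch in Python. The sender names and send volumes are hypothetical placeholders; a real analysis would use the observed send counts reported by your competitive intelligence tool.

    ```python
    # Minimal sketch: email share of voice across a competitive set.
    # All sender names and send volumes are hypothetical placeholders.
    observed_sends = {
        "our_brand": 1_200,      # emails observed over the tracking period
        "competitor_a": 2_400,
        "competitor_b": 900,
        "competitor_c": 1_500,
    }

    total = sum(observed_sends.values())

    # Share of voice: each sender's fraction of all observed email volume.
    share_of_voice = {sender: count / total for sender, count in observed_sends.items()}

    for sender, share in sorted(share_of_voice.items(), key=lambda kv: -kv[1]):
        print(f"{sender}: {share:.1%}")
    ```

    Tracked week over week, a shift in share of voice is often the first sign that a competitor has ramped up its sending volume.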
    Clearly there is a lot you can learn from your competitors. Once a year definitely won’t cut it if you want to keep your program fresh and continue to drive ROI. Instead, consider a two-part approach:
     
    Weekly and/or monthly: Take quick dives into the competitive tools you use to spot creative changes on a regular basis. This is strictly to generate ideas you can use to update your own testing grid, and it will help you with the first three items above. A frequent check-in keeps this from taking too much time, because you’ll have enough familiarity with the competitive landscape to scroll through quickly.
     
    Bi-monthly or quarterly: Keep your more formal reporting on a less frequent schedule. This reporting is important because it covers the last two items on the list above, but the underlying picture doesn’t change often. Quarterly may work, or you may decide that certain timeframes are so important to your business that you need to adjust your reporting schedule around them. Even with adjustments, a formal report shouldn’t come more often than every other month.
     
    Source: mediapost.com, November 16, 2016
  • What are the numbers, facts and figures behind big data?

    Business leaders know they want to invest in big data, and they have high expectations for ROI, but do they really know what big data is?

    In Gartner’s hype cycle, the term ‘big data’ was once a staple of the yearly report. It moved swiftly into the peak of inflated expectations, weathered its way through the trough of disillusionment, and is now prevalent – somewhere between the slope of enlightenment and the plateau of productivity.

    Expectations of the technology are high: a Gartner survey in September 2015 showed that more than 75% of companies are investing or planning to invest in big data in the next two years, and that 37% of those projects are driven from board level.

    But as a term, ‘big data’ still has no clear definition. For some, a dataset over a terabyte is big data; for others, it might be a million rows; and others still may have smaller datasets that change many times a second.

    In the era of Google, Facebook, Amazon and web-scale data, no dataset should be too difficult to analyse. It is all about having the right tool for the job.

    So instead of defining big data as a number or a size, it is more interesting and relevant to define it in terms of history, growth, compute and value.

    History

    29 million
    That’s the size, in number of records, of the world’s first big data project, in 1937. At the time, the US administration needed to keep track of social security contributions from some 26 million Americans and 3 million employers – 29 million records in all – and partners were sought to handle the task.

    IBM, with its giant punch-card machines of the time, won the contract – simultaneously laying the foundations for the Big Blue we know today and setting in motion automatic record keeping and data analysis on a massive scale.

    Growth

    1 Zettabyte
    For the first time, global internet traffic will surpass 1 zettabyte (1 billion terabytes) in 2016, according to a Cisco research paper, having risen fivefold over the past five years.

    A separate study estimates that 90% of the world’s data was generated in the past two years. Not only are we clicking, emailing, chatting and taking photos and videos more than ever, but companies have also cottoned on to the fact that data is valuable, so they are storing more and more of it. Datasets such as website access logs and click data are no longer thrown away – they are archived and mined for valuable insights.

    Compute

    24 months
    That’s how long Gordon Moore, co-founder of Intel, observed it took for the number of transistors in a dense integrated circuit to double. Similar patterns have been observed elsewhere: it has been estimated that the amount of data transmittable through an optical fibre doubles every nine months, and that storage density doubles roughly every 13 months.
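
    A quick back-of-the-envelope calculation shows how strongly these doubling periods compound. The sketch below uses only the periods quoted above; the five-year horizon is an arbitrary choice for illustration.

    ```python
    # Sketch: capacity growth implied by each doubling period quoted above.
    # Capacity multiplies by 2 ** (elapsed_months / doubling_period_months).
    doubling_period_months = {
        "transistor density (Moore's law)": 24,
        "optical-fibre throughput": 9,
        "storage density": 13,
    }

    horizon_months = 5 * 12  # arbitrary five-year horizon, for illustration

    for name, period in doubling_period_months.items():
        multiplier = 2 ** (horizon_months / period)
        print(f"{name}: ~{multiplier:.0f}x in five years")
    ```

    Over the same five years, transistor density grows roughly 6-fold while fibre throughput grows roughly 100-fold, which helps explain how transmission and storage have kept pace with the data explosion.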

    Organisations can process, transmit and store more data than ever, and capacity in all three areas is growing exponentially. Companies are better placed than ever before to deal with data.

    Value

    $59 billion
    That is the estimated value of the big data market in 2015, and it is expected to grow by almost three-quarters to $102 billion by 2019, according to IDC. Big data is big bucks, a 21st-century gold rush – and to extend the metaphor, big data analytics is the modern-day equivalent of panning for gold.
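
    As a sanity check on those figures, the implied compound annual growth rate can be worked out directly; this is a sketch using only the numbers quoted above.

    ```python
    # Sketch: compound annual growth rate (CAGR) implied by the IDC figures above.
    value_2015 = 59e9    # $59 billion
    value_2019 = 102e9   # $102 billion (forecast)
    years = 2019 - 2015

    cagr = (value_2019 / value_2015) ** (1 / years) - 1
    print(f"Implied growth: {cagr:.1%} per year")  # roughly 14.7% per year
    ```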

    For data-first companies, monetisation comes in the form of advertising – and big data analytics helps them to show an appropriate advert as well as analyse the results. For other companies it is often about increasing sales (the ‘I see you bought this, what about this?’ offers), automating decisions (big data gives the proof that an option is the correct one to take) and decreasing costs (driving efficiencies in the supply chain).

    It’s clear big data is growing – it’s here to stay and, right up to board level, companies have woken up to its potential.

    Companies see that with big data analytics they can find insights that enable them to outperform their competitors and reduce costs. With the right tools, these insights are easier to obtain than ever.

    Source: Information Age
