7 items tagged "2020"

  • 15 Questions helping you plan Competitive Intelligence for 2020

    15 Questions helping you plan Competitive Intelligence for 2020

    The new year is right around the corner and, with that, 2020 planning is kicking off. For the competitive intelligence (CI) professionals out there, how are you adjusting your CI program plans? How are you prioritizing the many ways you can support competitive strategies across the company? How are you aligning your CI initiatives with the business’s key goals for the coming year? As you build your CI program plan for the new year, these 15 questions will help you establish the foundation of a winning plan.

    Part 1: Competitive shifts from the past year

    The first step in establishing your 2020 CI plan is to review the competitive shifts from the past year. This will provide a backdrop to understand where the market is going and how the company has fared in the changing marketplace. In this section, answer the following questions:

    • What were the major competitive events from the last year, and what impact did they have on the market? 
      • Recap events such as acquisitions, product launches, or leadership changes from both your company and your competitors. What impact did each have on market share, brand perception, revenue, or other KPIs? What was the market’s feedback on each of these events, and how did that affect your success against your competitors?
    • Which competitors grew in prominence, and which new competitors emerged? 
      • In other words, how did the actual players change over the course of the year? Are there new competitors that are now contenders, while other competitors closed down or merged?
    • What direction are your competitors going? 
      • Based on the competitive shifts from the last year, where are your competitors going? Are they investing in new verticals, new technology, or other areas to get an edge? Look to signals like a series of product investments, recent hires, or new messaging rolled out on their website.
    • How did your competitive strategy perform?
      • Throughout the year, your company made many decisions aimed at winning in a competitive market. How effective was that strategy? What were the key activities completed, and how did they perform? What investments were made, and how great was the return?

    Part 2: SWOT Analysis

    The next step is to analyze the competitive landscape expected for the coming year. Leveraging a SWOT analysis allows you to identify opportunities and threats, while also articulating your company’s strengths and weaknesses. Here are the questions to address in your SWOT analysis:

    • What are your company’s strengths? 
      • Identify your company’s unique strengths that allow you to compete and win in your market. Strengths can touch on product advantages, brand strengths, or even operational processes that give you a leg up in the market.
    • What are your company’s weaknesses? 
      • Identify your company’s gaps that impact your success in the market. Being honest about these weaknesses allows you to address gaps, make necessary investments, or even change the rules of the game to favor your strengths instead of your weaknesses.
    • What are the new opportunities for the coming year? 
      • Based on recent and anticipated market shifts, determine potential opportunities for business growth. Perhaps there’s a competitor losing market share whose customers are ripe for the taking. Or perhaps there’s new technology that can open up unique functionality. Outlining these opportunities will set up a productive conversation for where and how to invest CI efforts.
    • What potential threats exist going into the new year? 
      • Once again, based on recent and anticipated market shifts, determine threats to your position in the market. This could include an emerging competitor gaining traction or a set of competitors moving in on your differentiated product features. Outlining these threats will allow you to plan ahead and determine what competitive strategies are needed.

    Part 3: Resources to support CI goals

    Once you have an understanding of the competitive landscape in the last year and where there are opportunities going forward, it’s time to determine the resources needed to achieve your CI goals. Dig into the following questions in this section of your plan:

    • What people are needed to get on board? 
      • Do you need to expand your CI team to address a wider range of competitors? Do you need different skill sets on your CI team to achieve your goals?
    • What tools or technology do you need? 
      • As you invest further in your CI efforts, what can be improved with tools or technology? What technology do you want to bring on board, and why?
    • How much budget is needed to secure the above resources? 
      • What budget is needed for the necessary hires and technology? Identifying these ballpark numbers sets the right expectations about the level of staffing and technology investment required.
    • How does the business need to support CI to be effective? 
      • What type of input or involvement do you need, and from whom, to have the maximum impact? Are there any organizational or cultural shifts that need to happen?
    • Which business investments are needed to execute a CI strategy? 
      • Are there particular investments in product, marketing, or elsewhere needed to deliver on chosen strategies to get a competitive advantage? Outline them to set expectations about the business decisions needed to follow through on your competitive strategy.

    Part 4: Alignment with business goals

    Building a CI program plan benefits the CI team as well as those outside it. This plan will likely be shared with other stakeholders: executive leadership, key CI audiences, and collaborators. That’s why it’s critical to articulate how the plan aligns with overall business goals; doing so allows other audiences to relate this work to what matters most to them. While there may not be many questions to answer in this section, they are among the most important to address:

    • What are the key performance indicators (KPIs) used to measure the CI program?
      • What metrics will you turn to in order to measure success?
    • How does the CI program support overall business KPIs? 
      • Connect the dots between CI success and business success.

    Part 5: Summary

    As you round out your CI program plan, it’s time to summarize the key points made throughout the plan: the key competitive shifts from the past year, the opportunities and threats for the coming year, key priorities and the metrics aligned with them, and any resources needed to succeed. Diving into each of these areas gives both competitive intelligence professionals and their stakeholders visibility into competitive shifts and the strategies needed to win in complex markets.

    Author: Ellie Mirman

    Source: Crayon

  • Gartner's New Year's resolutions for CIOs in 2020

    Gartner's New Year's resolutions for CIOs in 2020

    Before CIOs launch all kinds of new initiatives in the coming year, it is useful to clear out both mental and physical space to make room for those new things, Gartner advises.

    As the analyst firm notes in its 2020 CIO Resolutions report, however, this can be a challenge for today's technology and digital leaders.

    Despite rising IT budgets, many IT departments are underfunded because requests for IT exceed the available budgets, note the Gartner analysts and report authors Mark Raskino, Mary Mesaglio and Tina Nunno.

    'The pressure to do more with less, or simply to do more, can lead to significant physical and mental fatigue', they write.

    'CIOs must create an environment in which both they and their departments can eliminate mental clutter and start the new year energized'.

    Their recommendation? Drop useless baggage and get rid of things that yield less and less or add little value.

    One approach is to empty the office and bring back only the items CIOs feel positive about.

    'Rearrange the space for a new perspective', the analysts say.

    The entire IT department can also schedule a clean-up day.

    Gartner also advises working through the backlog of initiatives that weighs heavily on the department, and clearing out policies, processes or procedures that add no value.

    This way, CIOs can 'start fresh with a forward-looking plan'.

    The report's authors also call for an end to the CIO's 'open-door policy'.

    'Those who take advantage of an open office door are not always the people a CIO needs most', they note.

    Instead, CIOs should make themselves available to 'strategic invitees' or high performers from different levels and departments.

    'Delegate, empower and enable your leadership team to handle the other visitors', they emphasize.

    Thinking slow, not fast

    One of their top recommendations for CIOs is to 'think analog'.

    'Thinking slow, not fast, conquers complex problems and finds original solutions', they state. 'Your mind needs help to work harder and think more substantially. Sometimes less rigid, slower and more physical thinking tools help'.

    Remember the fountain pen? 'Use it to draft four or five culture-shifting sentences that should be repeated often in the coming year', they suggest.


    CIOs are advised to use top-quality writing paper and to write these words like a calligrapher, creating highly memorable aphorisms.

    Another approach is to use an entire office wall as 'thinking space', filled with details of their plans for 2020.

    This is common practice at digital giants and startups, Gartner notes.

    The analysts also advise 'designing your next organization in LEGO'.

    Get a large baseplate and a box of minifigures. The latter will represent the 'metaphorical construction workers, divers and superheroes' in the organization.

    As in previous years, Gartner closes with this perennial resolution: make time for hands-on experience with new technologies.

    These can range from everyday Internet of Things (IoT) products to low-code platforms that enable 'citizen developers' in the organization, as well as emerging technologies specific to their respective industries.

    This matters because, amid the busy and shifting role of today's CIOs, they are also expected to initiate forward-looking business deals and share insights on key emerging technologies, Gartner concludes.

    Author: Divina Paredes

    Source: CIO

  • The (near) future of data storage

    The (near) future of data storage

    As data proliferates at an exponential rate, companies must not only store it; they must also approach Data Management expertly and look to new approaches. Companies that take new and creative approaches to data storage will be able to transform their operations and thrive in the digital economy.

    How should companies approach data storage in the years to come? As we look into our crystal ball, here are important trends in 2020. Companies that want to make the most of data storage should be on top of these developments.

    A data-centric approach to data storage

    Companies today are generating oceans of data, and not all of that data is equally important to their function. Organizations that know this, and know which pieces of data are more critical to their success than others, will be in a position to better manage their storage and better leverage their data.

    Think about it. As organizations deal with a data deluge, they are trying hard to maximize their storage pools. As a result, they can inadvertently end up putting critical data on less critical servers. Doing so is a problem because it typically takes longer to access data on slower, secondary machines. It’s this lack of speed and agility that can have a detrimental impact on businesses’ ability to leverage their data.

    Traditionally, organizations have taken a server-based approach to their data backup and recovery deployments: their priority has been to back up their most critical machines rather than focusing on their most business-critical data.

    So, rather than having backup and recovery policies based on the criticality of each server, we will start to see organizations match their most critical servers with their most important data. In essence, the actual content of the data will become more of a decision-driver from a backup point of view.

    The most successful companies in the digital economy will be those that implement storage policies based not on their server hierarchy but on the value of their data.
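
    A hedged sketch of what such a data-centric policy might look like in practice follows; the tier names, criticality labels, and datasets are invented for illustration and are not taken from any particular backup product.

```python
# Illustrative only: choose a backup policy from the business value of the data,
# not from which server it happens to live on. All names and rules are hypothetical.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    server: str
    business_criticality: str  # "high", "medium", or "low"

BACKUP_POLICIES = {
    "high":   {"frequency": "every 15 minutes", "target": "fast primary backup pool"},
    "medium": {"frequency": "hourly",           "target": "standard backup pool"},
    "low":    {"frequency": "daily",            "target": "cheap secondary storage"},
}

def assign_policy(dataset: Dataset) -> dict:
    # The decision key is the data's criticality, not the server's place in the hierarchy.
    return BACKUP_POLICIES[dataset.business_criticality]

datasets = [
    Dataset("customer_orders", server="legacy-db-07", business_criticality="high"),
    Dataset("printer_logs", server="core-app-01", business_criticality="low"),
]
for ds in datasets:
    policy = assign_policy(ds)
    print(f"{ds.name} (on {ds.server}): back up {policy['frequency']} to {policy['target']}")
```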

    The democratization of flash storage

    With the continuing rise of technologies like IoT, artificial intelligence, and 5G, there will be an ever-greater need for high-performance storage. This will lead to the broader acceptance of all-flash storage. The problem, of course, is that flash storage is like a high-performance car: cool and sexy, but the price is out of reach for most.

    And yet traditional disk storage simply isn’t up to the task. Disk drives are like your family’s old minivan: reliable but boring and slow, unable to turn on a dime. But we’re increasingly operating in a highly digital world where data has to be available the instant it’s needed, not the day after. In this world, every company (not just the biggest and wealthiest ones) needs high-performance storage to run their business effectively.

    As the cost of flash storage drops, more storage vendors are bringing all-flash arrays to the mid-market, and more organizations will be able to afford this high-performance option. This price democratization will ultimately enable every business to benefit from the technology.

    The repatriation of cloud data

    Many companies realize that moving to the cloud is not as cost-effective, secure, or scalable as they initially thought. They’re now looking to return at least some of their core data and applications to their on-premises data centers.

    The truth is that data volumes in the cloud have become unwieldy. Organizations are discovering that storing data in the cloud is not only more expensive than they thought, but that it is also hard to access that data expeditiously because of the cloud’s inherent latency.

    As a result, it can be more beneficial in terms of cost, security, and performance to move at least some company data back on-premises.

    Now that they realize the cloud is not a panacea, organizations are embracing the notion of cloud data repatriation. They’re increasingly deploying a hybrid infrastructure in which some data and applications remain in the cloud, while more critical data and applications come back home to an on-premises storage infrastructure.

    Immutable storage for businesses of all sizes

    Ransomware will continue to be a scourge for all companies. Because hackers have realized that data stored on network-attached storage devices is extremely valuable, their attacks will become more sophisticated and targeted. This is a serious problem because backup data is typically the last line of defense. Hackers are also targeting unstructured data: if both the primary and the secondary (backup) copies are encrypted, businesses will have to pay the ransom if they want their data back. This increases the likelihood that an organization without a specific, immutable recovery plan in place will pay a ransom to regain control over its data.

    It is not a question of if, but when, an organization will need to recover from a ‘successful’ ransomware attack. Therefore, it’s more important than ever to protect this data with immutable object storage and continuous data protection. Organizations should look for a storage solution that protects information continuously by taking snapshots as frequently as possible (e.g., every 90 seconds). That way, even when data is overwritten, the older objects remain part of the snapshot, so there will always be another, immutable copy of the original objects that constitute the company’s data, and it can be recovered instantly… even if it’s hundreds of terabytes.
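
    To make the mechanism concrete, here is a minimal Python sketch (not any vendor's implementation) of how append-only, versioned object storage combined with frequent snapshots allows clean data to be recovered even after it has been overwritten. In a real product the snapshots would run on a schedule (the article suggests every 90 seconds) against immutable media, but the recovery principle is the same.

```python
import time
from dataclasses import dataclass, field

# Minimal illustration: an append-only object store that keeps every version of an
# object, plus periodic snapshots of the namespace. Overwrites never destroy older
# versions, so any snapshot can be restored after a ransomware attack.

@dataclass
class ImmutableObjectStore:
    versions: dict = field(default_factory=dict)   # key -> list of (timestamp, data)
    snapshots: list = field(default_factory=list)  # list of (timestamp, {key: version index})

    def put(self, key: str, data: bytes) -> None:
        # Writes append a new version; existing versions are never modified.
        self.versions.setdefault(key, []).append((time.time(), data))

    def take_snapshot(self) -> None:
        # Record which version of each object is current right now.
        view = {key: len(history) - 1 for key, history in self.versions.items()}
        self.snapshots.append((time.time(), view))

    def restore(self, snapshot_index: int) -> dict:
        # Rebuild the namespace exactly as it looked at snapshot time,
        # even if objects were overwritten (e.g. encrypted by ransomware) afterwards.
        _, view = self.snapshots[snapshot_index]
        return {key: self.versions[key][idx][1] for key, idx in view.items()}

store = ImmutableObjectStore()
store.put("finance/report.xlsx", b"original data")
store.take_snapshot()                          # e.g. scheduled every 90 seconds
store.put("finance/report.xlsx", b"\x00garbage written by ransomware")
clean = store.restore(0)                       # the older objects are still there
assert clean["finance/report.xlsx"] == b"original data"
```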

    Green storage

    Global data centers consume massive amounts of energy, which contributes to global warming. Data centers now eat up around 3% of the world’s electricity supply and are responsible for approximately 2% of global greenhouse gas emissions. These numbers put the carbon footprint of data centers on par with the entire airline industry.

    Many companies are seeking to reduce their carbon footprint and be good corporate citizens. As part of this effort, they are increasingly looking for more environmentally-friendly storage solutions, those that can deliver the highest levels of performance and capacity at the lowest possible power consumption.

    In 2020, organizations of all sizes will work hard to get the most from the data they create and store. By leveraging these five trends and adopting a modern approach to data storage, organizations can more effectively transform their business and thrive in the digital economy.

    The ‘Prevention Era’ will be overtaken by the ‘Recovery Era’

    Organizations will have to look to more efficient and different ways to protect unstructured and structured data. An essential element to being prepared in the ‘recovery era’ will involve moving unstructured data to immutable object storage with remote replication, which will eliminate the need for traditional backup. The nightly backup will become a thing of the past, replaced by snapshots every 90 seconds. This approach will free up crucial primary storage budget, VMware/Hyper-V storage, and CPU/memory for critical servers.

    While data protection remains crucial, in the data recovery era, the sooner organizations adopt a restore and recover mentality, the better they will be able to benefit from successful business continuity strategies in 2020 and beyond.

    Author: Sean Derrington

    Source: Dataversity

  • The most important BI trends for 2020

    The most important Business Intelligence trends for 2020

    Companies are in the midst of many profound changes: the amount of data available and the speed at which new data is produced have been increasing rapidly for years, and business models as well as process improvements increasingly rely on data and analytics.

    Against this backdrop, a key challenge is emerging: the efficient and, at the same time, innovative use of data is only possible when the capabilities for both analytics and data management are in place and operationalized. Many companies are already reaching their limits with a ‘the more data the better’ approach and cannot fully realize the benefits they expect due to a lack of data quality or analytical skills.

    In addition, there has been an increased focus on data protection since the GDPR came into effect in 2018. Amid a huge flood of information, companies will have to find ways to handle data in a way that not only complies with legal requirements, but also helps to improve processes and make day-to-day business easier.

    This year we asked 2,865 users, consultants and vendors for their views on the most important BI trends. The BARC BI Trend Monitor 2020 illustrates which trends are currently regarded as important in addressing these challenges by a broad group of BI and analytics professionals. Their responses provide a comprehensive picture of regional, company and industry specific differences and offer up-to-the-minute insights into developments in the BI market and the future of BI. Our long-term comparisons also show how trends in business intelligence have developed, making it possible to separate hype from stable trends.

    BARC’s BI Trend Monitor 2020 reflects on the business intelligence and data management trends currently driving the BI market from a user perspective.

    Importance of Business Intelligence trends in 2020 (n=2,865); importance rated on a scale of 1 to 10:

    1. Master data/data quality (MD/MQ) management: 7.3
    2. Data discovery/visualization: 6.9
    3. Establishing a data-driven culture: 6.9
    4. Data governance: 6.8
    5. Self-service BI: 6.5
    6. Data preparation by business users: 6.3
    7. Data warehouse modernization: 5.9
    8. Agile BI development: 5.8
    9. Real-time analytics: 5.6
    10. Advanced analytics/ML/AI: 5.5
    11. Big data analytics: 5.5
    12. Integrated platforms for BI and performance management (PM): 5.2
    13. Embedded BI and analytics: 5.1
    14. Data storytelling: 5.1
    15. Mobile BI: 5.1
    16. Analytics teams/data labs: 5.0
    17. Using external/open data: 4.9
    18. Cloud for data and analytics: 4.9
    19. Data catalogs: 4.2
    20. Process mining: 4.1

    The most (and least) important BI trends in 2020

    We asked users, consultants and software vendors of BI and data management technology to give their personal rating of the importance of twenty trending topics that we presented to them.

    Data quality/master data management, data discovery/visualization and data-driven culture are the three topics BI practitioners identify as the most important trends in their work.

    At the other end of the spectrum, cloud for BI and analytics, data catalogs and process mining were voted as the least important of the twenty trends covered in BARC’s survey.

    What do these results tell us?

    While the two most important trends remained the same as last year with master data and data quality management in first position and data discovery in second, third spot is now occupied by establishing a data-driven culture. This trend, which was newly introduced last year and went straight into fifth place in the rankings, is seen as even more important this year. Self-service BI, on the other hand, went down to fifth place this year whereas data governance remains in fourth.

    All in all, these five top trends represent the foundation for organizations to manage their own data and make use of it. Furthermore, they demonstrate that organizations are aware of the relevance of high-quality data and its effective use. These trends reflect changes in underlying structures: organizations want to go beyond collecting as much data as possible and actively use data to improve their business decisions. This is also supported by data warehouse modernization, which is once again in seventh place this year.

    Some trends have slightly increased in importance since last year (e.g., real-time analytics and integrated platforms for BI and PM). However, they all climbed just one rank, with the exception of establishing a data-driven culture, which jumped two places. Therefore, no huge shift can be observed in terms of upward trends.

    The opposite is the case for downward trends: Mobile BI fell from twelfth to fifteenth place this year, continuing a downward trend that started in 2017. It seems that mobile access to BI functions is no longer seen as important, either because it is now widely available or because requirements have shifted. Advanced analytics/machine learning/AI is ranked one place lower than last year (down from 9 to 10).

    More important than the difference of one rank, however, is the tendency behind this slight downward trend: in 2018, many hopes were pinned on new tools using machine learning and artificial intelligence, so this topic might have been expected to rise. Even if we describe it as stagnation in perceived importance rather than a 'real' downward trend, the result is surprising.

    Source: BI-Survey

  • Top artificial intelligence trends for 2020

    Top artificial intelligence trends for 2020

    Top AI trends for 2020 are increased automation to extend traditional RPA, deeper explainable AI with more natural language capacity, and better chips for AI on the edge.

    The AI trends 2020 landscape will be dominated by increasing automation, more explainable AI and natural language capabilities, better AI chips for AI on the edge, and more pairing of human workers with bots and other AI tools.

    AI trends 2020: increased automation

    In 2020, more organizations across many vertical industries will start automating their back-end processes with robotic process automation (RPA), or, if they are already using automation, increase the number of processes to automate.

    RPA is 'one of the areas where we are seeing the greatest amount of growth', said Mark Broome, chief data officer at Project Management Institute (PMI), a global nonprofit professional membership association for the project management profession.

    Citing a PMI report from summer 2019 that compiled survey data from 551 project managers, Broome said that now, some 21% of surveyed organizations have been affected by RPA. About 62% of those organizations expect RPA will have a moderate or high impact over the next few years.

    RPA is an older technology; organizations have used it for decades. It's starting to take off now, Broome said, partly because many enterprises are becoming aware of the technology.

    'It takes a long time for technologies to take hold, and it takes a while for people to even get trained on the technology', he said.

    Moreover, RPA is becoming more sophisticated, Broome said. Intelligent RPA, or simply intelligent process automation (IPA), which is RPA infused with machine learning, is becoming popular, with major vendors such as Automation Anywhere and UiPath touting their intelligent RPA products. With APIs and built-in capabilities, IPA enables users to scale up their automation use cases more quickly and easily, or to carry out more sophisticated tasks, such as automatically detecting objects on a screen, using technologies like optical character recognition (OCR) and natural language processing (NLP).
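
    As a rough, hypothetical illustration of the kind of building block such platforms combine (this is not the API of Automation Anywhere, UiPath, or any other RPA product), a script might use OCR to read text from a screenshot and simple keyword rules, standing in for an NLP classifier, to choose the next automation step:

```python
# Hypothetical sketch: OCR a captured screen and route the result to an automation step.
# Assumes the open-source pytesseract and Pillow packages; the routing rules are invented.
from PIL import Image
import pytesseract

def extract_screen_text(screenshot_path: str) -> str:
    # Run OCR over a screenshot and return the recognized text.
    return pytesseract.image_to_string(Image.open(screenshot_path))

def route_document(text: str) -> str:
    # Toy "intelligent" step: keyword rules standing in for an NLP classifier.
    lowered = text.lower()
    if "past due" in lowered or "overdue" in lowered:
        return "escalate_to_human"
    if "invoice" in lowered:
        return "enter_into_erp"
    return "send_to_review_queue"

if __name__ == "__main__":
    text = extract_screen_text("screenshot.png")  # path is a placeholder
    print(route_document(text))
```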

    Sheldon Fernandez, CEO of DarwinAI, an AI vendor focused on explainable AI, agreed that RPA platforms are becoming more sophisticated. More enterprises will start using RPA and IPA over the next few years, he said, but it will happen slowly.

    AI trends 2020: push toward explainable AI

    Even as AI and RPA become more sophisticated, there will be a bigger move toward more explainable AI.

    'You will see quite a bit of attention and technical work being done in the area of explainability across a number of verticals', Fernandez said.

    Users can expect two waves of effort behind explainable AI. First, vendors will make AI models more explainable for data scientists and other technical users. Eventually, they will make models explainable to business users.

    Technology vendors will also likely do more to address problems of data bias and to maintain more ethical AI practices.

    'As we head into 2020, we're seeing a debate emerge around the ethics and morality of AI that will grow into a highly contested topic in the coming year, as organizations seek new ways to remove bias in AI and establish ethical protocols in AI-driven decision-making', predicted Phani Nagarjuna, chief analytics officer at Sutherland, a process transformation vendor.

    AI trends 2020: natural language

    Furthermore, BI, analytics and AI platforms will likely get more natural language querying capabilities in 2020.

    NLP technology also will continue to evolve, predicted Sid Reddy, chief scientist and senior vice president at virtual assistant vendor Conversica.

    'Human language is complex, with hundreds of thousands of words, as well as constantly changing syntax, semantics and pragmatics and significant ambiguity that make understanding a challenge', Reddy said.

    'As part of the evolution of AI, NLP and deep learning will become very effective partners in processing and understanding language, as well as more clearly understanding its nuance and intent', he continued.

    Among the tech giants involved in AI, AWS, for example, revealed Amazon Kendra in November 2019, an AI-driven search tool that enables enterprise users to automatically index and search their business data. In 2020, enterprises can expect similar tools to be built into applications or sold as stand-alone products.
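
    As a hedged sketch of how such a tool can be embedded in application code, the snippet below queries a Kendra index via the boto3 SDK; it assumes AWS credentials and an existing index, and the index ID, region, and question are placeholders.

```python
# Hedged sketch: querying an AI-driven enterprise search index from application code.
import boto3

kendra = boto3.client("kendra", region_name="us-east-1")

def search(question: str, index_id: str):
    response = kendra.query(IndexId=index_id, QueryText=question)
    # Return the top few document excerpts the service considers relevant.
    return [
        item.get("DocumentExcerpt", {}).get("Text", "")
        for item in response.get("ResultItems", [])[:3]
    ]

if __name__ == "__main__":
    for excerpt in search("What is our parental leave policy?", index_id="REPLACE-WITH-INDEX-ID"):
        print(excerpt)
```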

    More enterprises will deploy chatbots and conversational agents in 2020 as well, as the technology becomes cheaper, easier to deploy and more advanced. Organizations won't fully replace contact center employees with bots, however. Instead, they will pair human employees more effectively with bot workers, using bots to answer easy questions, while routing more difficult ones to their human counterparts.
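
    A deliberately simple sketch of that routing pattern follows; the FAQ entries, the keyword-overlap 'classifier', and the confidence threshold are invented, and a production contact center would use a real intent model in their place.

```python
import re

# Toy human-bot routing: the bot answers questions it is confident about and
# hands everything else to a human agent. FAQs, scoring, and threshold are invented.
FAQ_ANSWERS = {
    "opening hours": "We are open 9am-5pm, Monday to Friday.",
    "reset password": "Use the 'Forgot password' link on the login page.",
}

CONFIDENCE_THRESHOLD = 0.75

def classify(question: str):
    # Stand-in for an NLP intent classifier: crude keyword-overlap score per intent.
    words = set(re.findall(r"[a-z]+", question.lower()))
    best_intent, best_score = None, 0.0
    for intent in FAQ_ANSWERS:
        intent_words = intent.split()
        score = len(words & set(intent_words)) / len(intent_words)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent, best_score

def route(question: str) -> str:
    intent, confidence = classify(question)
    if intent is not None and confidence >= CONFIDENCE_THRESHOLD:
        return "BOT: " + FAQ_ANSWERS[intent]
    return "HUMAN: routed to a live agent"

print(route("What are your opening hours?"))    # easy question, answered by the bot
print(route("My invoice from March is wrong"))  # harder case, escalated to a human
```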

    'There will be an increased emphasis in 2020 on human-machine collaboration', Fernandez said.

    AI trends 2020: better AI chips and AI at the edge

    To power all the enhanced machine learning and deep learning applications, better hardware is required. In 2020, enterprises can expect hardware that's specific to AI workloads, according to Fernandez.

    In the last few years, a number of vendors, including Intel and Google, have released AI-specific chips and tensor processing units (TPUs). That will continue in 2020, as startups begin to enter the hardware space. The startup Cerebras, founded in 2016, for example, unveiled a giant AI chip that made the news. The chip, which Cerebras claims is the largest ever made, is the size of a dinner plate and designed to power massive AI workloads. The vendor shipped some last year, with more expected to ship this year.

    While Cerebras may have created the largest chip in the world, 2020 will likely introduce smaller pieces of hardware as well, as more companies move to do AI at the edge.

    Max Versace, CEO and co-founder of neural network vendor Neurala, which specializes in AI technology for manufacturers, predicted that in 2020, many manufacturers will move toward the edge, and away from the cloud.

    'With AI and data becoming centralized, manufacturers are forced to pay massive fees to top cloud providers to access data that is keeping systems up and running', he said. 'As a result, new routes to training AI that can be deployed and refined at the edge will become more prevalent'.

    Author: Mark Labbe

    Source: TechTarget

  • What to expect for data governance in 2020?

    What to expect for data governance in 2020?

    Data governance has always been a complicated issue for most organizations. That won’t change in a big way in 2020. In fact, the increasing prevalence of technologies like artificial intelligence (AI) and machine learning (ML) may expose some of the pain points even more. Don’t take that to mean that companies aren’t becoming more mature in their approach to Data Governance, though.

    AI, ML, the Internet of Things (IoT), and full process digitization will be a focus for organizations in 2020. Companies see them as required capabilities in the future and so are willing to invest in more digital innovation. 'This is expanding the governance lens and I’m seeing AI Governance becoming a reality in leading organizations', said Kelle O’Neal, founder and CEO of First San Francisco Partners. This trend shows that companies are seeing value in Data Governance so they’re extending successful practices into other areas of their business, she said.

    Organizations are realizing that AI is only successful when built upon a solid data foundation, thus driving the need for data governance, agreed Donna Burbank, managing director at Global Data Strategy:

    'I’ve had venture capital organizations approach us to train their AI startups in the foundations of data governance as a condition for investment', she said. 'I see that as an extremely positive sign pointing to the widespread need and adoption of data governance principles'.

    And yet poor data quality resulting from problems with data governance continues to bedevil AI and ML outcomes, and there’s no sign that will change next year.

    'Artificial intelligence and machine learning have been way oversold. Data quality gets in the way of getting good results and organizations spend way, way more time cleaning things up', said Thomas C. Redman, Ph.D., 'the Data Doc' and President of Data Quality Solutions. He estimates that more than 80% of AI and ML programs continue to fail because of this.

    Governance defined… yet?

    One question that many companies will continue to grapple with in the new year is figuring out just what data governance is. In simple terms, said Redman, it’s a senior oversight function whose leaders advise the board or senior management about whether a data-related program is designed in the best interest of the company and is operating as designed. And as he sees it, no one is doing that yet.

    'There’s all talk about data as the most important asset, but having that oversight level would be essential if that statement were correct', he said. It’s not about plugging in various tools but about thinking of just what data governance is … and what it isn’t:

    'The term ‘governance’ is being used for everything from moving data from here to there to something about how you operate analytics. That’s not the proper use of the term'.

    Getting roles and responsibilities right is critical, he said. Data governance should be business-led and IT-supported, Burbank remarked:

    'All areas of the business need to have accountability for the data in their domain and establishing data stewardship roles is critical to ensuring accountability at all levels of the organization from strategic to tactical'.

    Chief Data Officer (CDO) roles are becoming more common, and the office of the CDO does best when it reports up through a business function like operations, strategy, or shared services, said O’Neal, or even finance if that team is influential in driving enterprise programs that result in corporate growth.

    Organizations that have matured their data governance practices will grow from a program culture to a data culture, which is one:

    'Where new employees start learning about data governance as part of their new-hire training, and data governance and management are part of the conversation at the board level', said O’Neal.

    What will data governance look like in 2020?

    It’s true that there haven’t been drastic changes in how far we’ve come with data governance over the past year, but O’Neal finds that companies are showing progress:

    'More and more companies are moving from ‘what is data governance and why should I do it,’ past creating a strategy, into not just implementation but also operationalization, where their data governance is really embedded with other project, decision-making, and ‘business as usual’ operations', she said.

    In terms of a formal, structured approach, the DAMA DMBoK is gaining wide acceptance, which is a positive step in aligning best practices, Burbank said:

    'While data governance is certainly not a ‘cookie cutter’ approach that can be simply taken from a book, the DMBOK does offer a good foundation on which organizations can build and customize to align with their own unique organizational needs and culture'.

    In 2019, Global Data Strategy supported data governance for a diverse array of sectors, including social services, education, manufacturing, insurance, building, and construction. 'It’s no longer just the traditional sectors like finance who understand the value of data', she said.

    Big value in small wins

    It’s really hard to impose Data Governance frameworks on big data at enterprise scale. It is better to start with small data first, and Redman is optimistic that more companies will do so in 2020.

    'Practically everyone sees the logic in small data projects', he said. 'Suppose that only half of a hundred small data projects succeed, that’s a huge number of wins', with positive implications for cost savings and improvements in areas like customer service. And solving more of these leads to learning about what it takes to solve big data problems. 'If you build the organizational muscle you need doing small data projects you can tackle big data projects'.

    Following the classic rule of thinking big and starting small in order to have the proper data governance framework and foundation in place is what works, Burbank said. Establishing small 'quick wins' shows continual value across the organization.

    Tools to help

    2018 saw astounding growth in the data catalog market, O’Neal said. Data catalogs provide information about each piece of data, such as the location of entities and data lineage. So, if you haven’t thought about a data catalog yet, it’s time to do so this year, she said.
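
    To make the idea tangible, here is a minimal, hypothetical sketch of what a data catalog entry might record; the field names and example values are illustrative and not taken from any particular catalog product.

```python
# Hypothetical data catalog entry: what the data is, where it lives, and where it came from.
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str                       # business-friendly name of the data asset
    description: str                # what the data means to the business
    location: str                   # physical/logical location of the entities
    owner: str                      # accountable data steward
    lineage: list = field(default_factory=list)  # upstream sources this asset is derived from
    tags: list = field(default_factory=list)     # governance labels, e.g. PII, retention class

customer_revenue = CatalogEntry(
    name="customer_monthly_revenue",
    description="Monthly recognized revenue per customer, used in the sales dashboard",
    location="warehouse.analytics.customer_monthly_revenue",
    owner="finance.data.steward@example.com",
    lineage=["crm.accounts", "billing.invoices"],
    tags=["finance", "governed", "contains-PII"],
)
print(customer_revenue.name, "->", customer_revenue.lineage)
```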

    The good news is that the modern tools for Metadata Management and data cataloguing are much more user-friendly and approachable, according to Burbank:

    'Which is a great advancement for giving business users self-service capability and accountability for metadata and governance'.

    Redman noted that 'you can love your data governance tools, and I do too. But if you approach the problem wrong it doesn’t matter what tools you have'.

    What’s up next

    In 2020, the organizations that are able to get their own data governance in order will reach out to others in the industry to establish cross-organization data governance and data sharing agreements:

    'For example, organizations in the social services or medical arena are looking to provide cohesive support for individuals across organizations that provide the best level of service, while at the same time protecting privacy', Burbank said. 'It’s an interesting challenge, and an area of growth and opportunity in the data governance space'.

    There’s an opportunity this year for companies that are moderately mature in data governance to think about how to embed practices in the business processes and decision-making structures of the organization. Places to look for embedding opportunities, O’Neal commented, are new project initiation and project management, investment approval and funding, customer creation and onboarding, product development and launch, and vendor management/procurement.

    Expect data analytics and BI to continue to be large drivers for data governance:

    'As more organizations want to become data-driven', Burbank said, 'they are realizing that the dashboards used to drive business decision-making must be well-governed and well-understood with full data lineage, metadata definitions, and so on'.

    Author: Jennifer Zaino

    Source: Dataversity

  • What to expect from the data decade 2020-2030?

    What to expect from the data decade 2020-2030?

    From wild speculation that flying cars will become the norm to robots that will tend to our every need, there is lots of buzz about how AI, Machine Learning, and Deep Learning will change our lives. At present, however, much of this still seems like a far-fetched future.

    As we enter the 2020s, there will be significant progress in the march towards the democratization of data, which will fuel some significant changes. Gartner identified democratization as one of its top ten strategic technology trends for the enterprise in 2020, and this shift in the ownership of data means that anyone can use the information at any time to make decisions.

    The democratization of data is frequently referred to as citizen access to data. The goal is to remove any barriers to accessing or understanding data. The explosion of information generated by IoT, Machine Learning, and AI, coupled with digital transformation, will result in substantial changes not only in the volume of data but also in the way we process and use this intelligence.

    Here are four predictions for the near future:

    1. Medical records will be owned by the individual

    Over the last decade, medical records have moved from paper to digital. However, they are still fragmented, with multiple healthcare providers owning different parts, and this has generated a vast array of inefficiencies. As a result, new legislation will come into effect before the end of 2023 that will allow individuals, rather than doctors or health insurance companies, to own their health records.

    This law will enable individuals to control access to their medical records and share them only when they decide. Because you own your golden health data record, all of the information will be in one centralized place, allowing the providers you share it with to make fully informed decisions in your best interest. Individuals will have the power to determine who can view their health records, which will take the form of a digital twin of your files. When you visit a doctor, you will take this health record with you and check it in with the health provider; when you check out, the provider will be required to delete your digital footprint.

    When you select medication at CVS, for example, the pharmacist will be able to scan your smart device to see what medications you are taking and other health indicators, and then advise whether the drug you selected is optimal for you. This will shift the way we approach healthcare from a reactive to a personalized, preventative philosophy. Google has already started down this path with its Project Nightingale initiative, which aims to use data, machine learning, and AI to suggest changes to individual patients' care. Separating the data from the platform will, in turn, fuel a whole new set of healthcare startups driven by predictive analytics that will, over time, change the entire dynamics of the health insurance market. This will usher in a new era of healthcare that moves towards the predictive maintenance of humans, killing the established health insurance industry as we know it. Many of the incumbent healthcare giants will have to rethink their business model completely, though what form this will take is currently murky.

    2. Employee analytics will be regulated

    An algorithm learns from the data it is provided, so if it’s fed a biased data set, it will give biased recommendations. This inherent bias in AI will see new legislation introduced to prevent discrimination. The regulation will put the onus on employers to ensure that their algorithms are not prejudiced and that the same ethics they apply in the physical world also apply in the digital realm. As employee analytics determine pay raises, performance bonuses, promotions, and hiring decisions, this legislation will ensure a level playing field for all. As this trend evolves, employees will control their own data footprint, and when they leave an organization, rather than clearing out their physical workspace, they will take their data footprint with them.

    3. Edge computing: from niche to mainstream

    Edge computing is dramatically changing the way data is stored and processed. The rise of IoT, serverless apps, peer-to-peer applications, and the plethora of streaming services will continue to fuel the exponential growth of data. This, coupled with the introduction of 5G, will deliver faster networking speeds, enabling edge computing to process and store data faster in support of critical real-time applications like autonomous vehicles and location services. As a result of these changes, by the end of 2021 more data will be processed at the edge than in the cloud. The continued explosive growth in data volumes, coupled with faster networking, will drive edge computing from niche to mainstream as processing shifts from predominantly the cloud to the edge.

    4. Machine unlearning will become important

    With the rise of intelligent automation, 2020 will see the rise of machine unlearning. As data sets continue to grow rapidly, knowing which learning to keep and which to discard will be another essential aspect of intelligent data. Humans have a well-developed ability to unlearn information; machines, however, are currently not good at this and are only able to learn incrementally. Software has to be able to discard information that prevents it from making optimal decisions rather than repeating the same mistakes. As the decade progresses, machine unlearning, where systems unlearn digital assets, will become essential to developing secure AI-based systems.
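
    As a deliberately trivial sketch of the idea, the toy 'model' below (a running average) can subtract out a single example exactly; real machine learning models generally cannot forget this cleanly, which is why unlearning is an open research problem rather than an existing feature.

```python
# Toy illustration of machine unlearning: a model that can remove the influence
# of a specific training example without retraining from scratch.
class RunningMeanModel:
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def learn(self, value: float) -> None:
        self.total += value
        self.count += 1

    def unlearn(self, value: float) -> None:
        # Remove exactly one previously learned value's contribution.
        self.total -= value
        self.count -= 1

    def predict(self) -> float:
        return self.total / self.count if self.count else 0.0

model = RunningMeanModel()
for v in [10.0, 12.0, 1000.0]:   # 1000.0 is a bad data point we later want to forget
    model.learn(v)
print(model.predict())           # skewed by the bad value
model.unlearn(1000.0)
print(model.predict())           # back to 11.0 after unlearning
```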

    As the democratization of intelligent data becomes a reality, it will ultimately create a desirable, egalitarian end-state where all decisions are data-driven. This shift, however, will change the dynamics of many established industries and make it easier for smaller businesses to compete with large established brands. Organizations must anticipate these changes and rethink how they process and use intelligent data to ensure that they remain relevant in the next decade and beyond.

    Author: Antony Edwards

    Source: Dataconomy
