13 items tagged "IoT"

  • ‘Privacy as an integral part of security’

    Experts make predictions for 2016

    In 2016, a lot is set to happen in the area of privacy and data protection. Although the ‘security versus privacy’ debate already started this year, it will truly erupt in 2016. Computable experts discuss various IT security predictions for the coming year, touching on topics such as cloud security and the integration of security solutions.

    Privacy
    Richard van Lent, managing partner at MITE Systems:
    2015 was dominated above all by topics such as ‘Cloud First’, ‘Mobile First’ and the internet of things (IoT). In addition, the Dutch Data Breach Notification Act (Wet Meldplicht Datalekken), an addition to the existing Personal Data Protection Act (Wbp), has stirred up quite a bit of dust in recent months within departments such as HR, legal/risk compliance, security and IT. For most companies, data analytics solutions will become a precondition for safeguarding privacy, compliance and security in daily operations. A key trend for 2016, currently developing at a rapid pace, is therefore the use of data analytics solutions to make matters such as security, predictability and behaviour visible in near real time.

    Gerard Stroeve, manager Security & Continuity Services at Centric:
    Another important security trend for 2016 is privacy. A great deal is set to happen in this area in the coming year. Take the data breach notification obligation: it takes effect on 1 January 2016 and requires organisations (companies as well as public bodies) to report serious data breaches immediately. The Dutch Data Protection Authority (College Bescherming Persoonsgegevens, or CBP, which continues from 1 January as the Autoriteit Persoonsgegevens) also gains the power to impose fines of up to 820,000 euros. In addition, the European General Data Protection Regulation is an important international development in the field of privacy. It too is expected to come into force in early 2016, with a transition period of about eighteen months. In the coming year, organisations really need to get to work preparing for that new legislation.

    It is important not to approach privacy as a stand-alone subject. Privacy has its own specific laws and regulations, but it is an integral part of information management and information security as a whole.

    Lex Borger, Principal Consultant at I-to-I:
    The political debate over ‘security versus privacy’ has already begun, but it will truly erupt during the election year in the United States. How much privacy must we give up to fight terrorism? May governments demand access to information that is transmitted or stored in encrypted form? Can terrorism be fought that way? Can we trust governments with those capabilities? Can we prevent others (governments, organised crime) from abusing them? Is the citizen still allowed to have something to hide? Plenty of opportunity for debate; I am curious.

    Cloud, integration and IoT
    Gerard Stroeve, manager Security & Continuity Services, Centric:
    The third major trend for 2016 is cloud security. Many organisations indicate that they still have too little grip on the cloud solutions they use. How, for example, do you reach agreements with large public cloud providers such as Google, Microsoft and Amazon? Another major challenge is so-called ‘shadow IT’, a phenomenon almost every organisation has to deal with nowadays. When the right functionality is not available, or not available quickly enough, employees increasingly find their own solution in the cloud. Slowly but surely, this creates an uncontrolled proliferation of cloud solutions over which the organisation has no grip. That calls for clear policy and good guidelines around safe cloud use, without getting in the way of productivity. The 4C approach offers a practical tool here.

    John Veldhuis, Senior System Consultant at Sophos:
    Just as we have seen previously stand-alone gateway devices for email/web proxying, VPN, packet filtering, layer 7 filtering, load balancing and so on integrated into a UTM, we are going to see a further evolution in which anti-malware, UTM and encryption solutions talk to each other. For example: suspicious traffic, say ransomware-related, is detected. This can be done by software on the infected machine, but also by the UTM. This makes it possible to automatically (a sketch of such a response pipeline follows the list):

    • strip the machine of its keys, so that confidential data cannot be leaked and modifying it (encryption by ransomware) can no longer take place either (no key = no access);
    • place the machine in a quarantine or mitigation network, start a search on the remaining machines for the file that caused the traffic, trace the file’s origin and add it to a reputation filter, and so on.
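
    A minimal Python sketch of the kind of coordinated response Veldhuis describes, assuming hypothetical hook functions (revoke_keys, quarantine, sweep_and_flag); real products would drive these steps through their own management APIs:

    ```python
    # Hypothetical automated response to a suspicious-traffic detection.
    from dataclasses import dataclass

    @dataclass
    class DetectionEvent:
        host: str          # machine that produced the suspicious traffic
        file_hash: str     # hash of the file that caused the traffic

    def revoke_keys(host: str) -> None:
        """Remove encryption keys: data can neither be leaked nor re-encrypted."""
        print(f"[keys] revoked on {host} (no key = no access)")

    def quarantine(host: str) -> None:
        """Move the machine into an isolated mitigation network."""
        print(f"[network] {host} moved to quarantine network")

    def sweep_and_flag(file_hash: str, fleet: list[str]) -> None:
        """Search remaining machines for the file and add it to a reputation filter."""
        for host in fleet:
            print(f"[sweep] searching {host} for {file_hash}")
        print(f"[reputation] {file_hash} added to block list")

    def handle(event: DetectionEvent, fleet: list[str]) -> None:
        revoke_keys(event.host)
        quarantine(event.host)
        sweep_and_flag(event.file_hash, [h for h in fleet if h != event.host])

    handle(DetectionEvent("pc-042", "e3b0c44298fc1c14"), ["pc-041", "pc-042", "pc-043"])
    ```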

    Harm de Haan, manager consultancy at Telindus:
    Due to the increasing complexity of IT infrastructures and the degree to which their components are integrated with one another, security is becoming ever harder to guarantee. Moreover, the consequences of a breach are greater because of the new Dutch Data Breach Notification Act. This makes security an important theme for many organisations in 2016. Endpoint solutions cannot sufficiently guarantee security, precisely because of the complexity of modern infrastructures. Organisations would do better to look at ‘security by design’: an approach in which security is built in during the design phase of an infrastructure, as part of the individual components such as compute, networking and storage.

    Lex Borger, Principal Consultant at I-to-I:
    No more excuses: TLS can now be used everywhere, for free. Trusted third parties will have to change their business model: Let’s Encrypt delivers free TLS certificates with automatic provisioning. Your site can always support HTTPS. No expensive certificates to manage, no cumbersome application process, though you do have to demonstrate that you control your website. For many, this will be secure enough. The paid certificate-management process is complex enough as it is; it is absurd that some still argue for keeping SHA-1 valid longer because we cannot track down and replace our certificates quickly enough.
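
    To illustrate why provisioning can be automated, here is a minimal sketch of the idea behind the ACME HTTP-01 challenge that Let’s Encrypt uses: prove control of a domain by serving a CA-issued token at a well-known URL. The token values below are placeholders; real clients such as certbot handle the full protocol, key generation and renewal:

    ```python
    # Minimal sketch of ACME HTTP-01 domain validation (tokens are placeholders).
    from http.server import BaseHTTPRequestHandler, HTTPServer

    CHALLENGE_TOKEN = "example-token"                      # issued by the CA
    KEY_AUTHORIZATION = b"example-token.account-key-thumbprint"

    class AcmeChallengeHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The CA fetches this path to verify you control the domain.
            if self.path == f"/.well-known/acme-challenge/{CHALLENGE_TOKEN}":
                self.send_response(200)
                self.end_headers()
                self.wfile.write(KEY_AUTHORIZATION)
            else:
                self.send_error(404)

    # HTTPServer(("", 80), AcmeChallengeHandler).serve_forever()  # port 80 needs privileges
    ```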

    The internet of things will also grow explosively in 2016, with sensors and small automatons that must work simply and be easy to connect. This gives hackers a great many opportunities to do who knows what (I am not creative enough to imagine everything that is possible). We are going to suffer considerably from the fact that our network infrastructures are inherently insecure. It is still easy to impersonate another device on the internet. That fact underlies many attacks, such as DDoS and identity theft.

    Finally: a solid foundational process
    Beyond these three themes, information security above all demands an integral approach, says Gerard Stroeve. ‘Information security is a broad field with various areas of attention. Besides the themes mentioned, there is cyber security, for example. The number of DDoS or ransomware attacks will not decrease in the coming year. We also expect attention to availability and continuity management to grow in the coming year.’

    To deal effectively with all these diverse threats, he considers a solid foundational process crucial. ‘A good governance model puts you in a position to respond to new and changing threats. Classification and risk analysis are central here. It is also important that information security gains firm management support.’

    Source: Computable

  • 8 out of 10 companies store sensitive data in the cloud

    85% of companies store sensitive data in the cloud. This is a sharp rise compared to the 54% that reported doing so last year. 70% of companies are concerned about the security of this data.

    This emerges from research by 451 Research commissioned by Vormetric, a provider of data security for physical, big data, and public, private and hybrid cloud environments. Sensitive data does not live only in the cloud, of course. 50% of companies report storing sensitive data in big data systems (up from 31% last year), and 33% have stored such data in Internet of Things (IoT) environments.

    Concerns about the cloud
    451 Research also asked respondents about their concerns over the security of their sensitive data in the cloud. The main concerns are:

    • Cyber attacks and breaches at a service provider (70%)
    • The vulnerability of shared infrastructure (66%)
    • A lack of control over the location where data is stored (66%)
    • The lack of a data privacy policy or privacy SLA (65%)

    Respondents were also asked which changes would increase their willingness to put data in the cloud. The changes respondents most need are (the first is sketched in code after the list):

    • Encryption of data, with the encryption key managed on the company’s own infrastructure (48%)
    • Detailed information about physical and IT security (36%)
    • The option to encrypt data stored on a service provider’s infrastructure (35%)
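
    As a rough illustration of the first item, a sketch of client-side encryption where the key never leaves the company’s own infrastructure, using Python’s third-party cryptography package (pip install cryptography); the upload step is a hypothetical placeholder:

    ```python
    # Encrypt locally, keep the key locally: the provider only sees ciphertext.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # stays on-premises, never travels with the data
    cipher = Fernet(key)

    record = b"customer: Jane Doe, iban: NL00BANK0123456789"
    ciphertext = cipher.encrypt(record)

    # upload_to_cloud(ciphertext)      # hypothetical: only ciphertext is stored remotely

    assert cipher.decrypt(ciphertext) == record   # decryption requires the local key
    ```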

    Concerns about big data systems
    The storage of sensitive data in big data systems also worries respondents. The main concerns are:

    • The security of reports created with big data systems, since these may contain sensitive data (42%)
    • The fact that data may reside in any location within these environments (41%)
    • Privacy violations involving data originating from different countries (40%)
    • Access by users with ‘super rights’ to protected data (37%)
    • The lack of a security framework and management capabilities within the environment (33%)

    451 Research also notes that big data systems often run in the cloud. Concerns about storing sensitive data in the cloud therefore also apply to data stored in big data environments.

    Data in IoT environments also raises concerns
    Finally, 451 Research looks at the concerns companies have about storing data in IoT environments. The main concerns in this area are:

    • Protecting the data that IoT creates (35%)
    • Privacy violations (30%)
    • Identifying which data is sensitive (29%)
    • Access by users with ‘super rights’ to IoT data and devices (28%)
    • Attacks on IoT devices that could impact critical business operations (27%)

    You can read the full study HERE

    Source: Executive People

  • Artificial intelligence: Can Watson save IBM?

    The history of artificial intelligence has been marked by seemingly revolutionary moments — breakthroughs that promised to bring what had until then been regarded as human-like capabilities to machines. The AI highlights reel includes the “expert systems” of the 1980s and Deep Blue, IBM’s world champion-defeating chess computer of the 1990s, as well as more recent feats like the Google system that taught itself what cats look like by watching YouTube videos.

    But turning these clever party tricks into practical systems has never been easy. Most were developed to showcase a new computing technique by tackling only a very narrow set of problems, says Oren Etzioni, head of the AI lab set up by Microsoft co-founder Paul Allen. Putting them to work on a broader set of issues presents a much deeper set of challenges.
    Few technologies have attracted the sort of claims that IBM has made for Watson, the computer system on which it has pinned its hopes for carrying AI into the general business world. Named after Thomas Watson Sr, the chief executive who built the modern IBM, the system first saw the light of day five years ago, when it beat two human champions on an American question-and-answer TV game show, Jeopardy!
    But turning Watson into a practical tool in business has not been straightforward. After setting out to use it to solve hard problems beyond the scope of other computers, IBM in 2014 adapted its approach.
    Rather than just selling Watson as a single system, its capabilities were broken down into different components: each of these can now be rented to solve a particular business problem, a set of 40 different products such as language-recognition services that amount to a less ambitious but more pragmatic application of an expanding set of technologies.
    Though it does not disclose the performance of Watson separately, IBM says the idea has caught fire. John Kelly, an IBM senior vice-president and head of research, says the system has become “the biggest, most important thing I’ve seen in my career” and is IBM’s fastest growing new business in terms of revenues.
    But critics say that what IBM now sells under the Watson name has little to do with the original Jeopardy!-playing computer, and that the brand is being used to create a halo effect for a set of technologies that are not as revolutionary as claimed.

    “Their approach is bound to backfire,” says Mr Etzioni. “A more responsible approach is to be upfront about what a system can and can’t do, rather than surround it with a cloud of hype.”
    Nothing that IBM has done in the past five years shows it has succeeded in using the core technology behind the original Watson demonstration to crack real-world problems, he says.

    Watson’s case
    The debate over Watson’s capabilities is more than just an academic exercise. With much of IBM’s traditional IT business shrinking as customers move to newer cloud technologies, Watson has come to play an outsized role in the company’s efforts to prove that it is still relevant in the modern business world. That has made it key to the survival of Ginni Rometty, the chief executive who, four years after taking over, is struggling to turn round the company.
    Watson’s renown is still closely tied to its success on Jeopardy! “It’s something everybody thought was ridiculously impossible,” says Kris Hammond, a computer science professor at Northwestern University. “What it’s doing is counter to what we think of as machines. It’s doing something that’s remarkably human.”

    By divining the meaning of cryptically worded questions and finding answers in its general knowledge database, Watson showed an ability to understand natural language, one of the hardest problems for a computer to crack. The demonstration seemed to point to a time when computers would “understand” complex information and converse with people about it, replicating and eventually surpassing most forms of human expertise.
    The biggest challenge for IBM has been to apply this ability to complex bodies of information beyond the narrow confines of the game show and come up with meaningful answers. For some customers, this has turned out to be much harder than expected.
    The University of Texas’s MD Anderson Cancer Center began trying to train the system three years ago to discern patients’ symptoms so that doctors could make better diagnoses and plan treatments.
    “It’s not where I thought it would go. We’re nowhere near the end,” says Lynda Chin, head of innovation at the University of Texas’ medical system. “This is very, very difficult.” Turning a word game-playing computer into an expert on oncology overnight is as unlikely as it sounds, she says.

    Part of the problem lies in digesting real-world information: reading and understanding reams of doctors’ notes that are hard for a computer to ingest and organise. But there is also a deeper epistemological problem. “On Jeopardy! there’s a right answer to the question,” says Ms Chin, but in the medical world there are often just well-informed opinions.
    Mr Kelly denies IBM underestimated how hard challenges like this would be and says a number of medical organisations are on the brink of bringing similar diagnostic systems online.


    Applying the technology
    IBM’s initial plan was to apply Watson to extremely hard problems, announcing in early press releases “moonshot” projects to “end cancer” and accelerate the development of Africa. Some of the promises evaporated almost as soon as the ink on the press releases had dried. For instance, a far-reaching partnership with Citibank to explore using Watson across a wide range of the bank’s activities quickly came to nothing.
    Since adapting its approach in 2014, IBM now sells some services under the Watson brand. Available through APIs, or programming “hooks” that make them available as individual computing components, they include sentiment analysis — trawling information like a collection of tweets to assess mood — and personality tracking, which measures a person’s online output using 52 different characteristics to come up with a verdict.
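
    To make the componentised, API-driven model concrete, here is a hedged sketch of what consuming such a language service could look like; the endpoint, payload and response shape are hypothetical, not IBM’s actual Watson API, and real services differ in authentication and details:

    ```python
    # Hypothetical REST call to a sentiment-analysis service component.
    import requests

    def sentiment(text: str) -> float:
        resp = requests.post(
            "https://api.example.com/v1/sentiment",   # hypothetical endpoint
            json={"text": text},
            headers={"Authorization": "Bearer <api-key>"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["score"]    # e.g. -1.0 (negative) .. 1.0 (positive)

    print(sentiment("The new release is a big improvement."))
    ```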

    At the back of their minds, most customers still have some ambitious “moonshot” project they hope that the full power of Watson will one day be able to solve, says Mr Kelly; but they are motivated in the short term by making improvements to their business, which he says can still be significant.
    This more pragmatic formula, which puts off solving the really big problems to another day, is starting to pay dividends for IBM. Companies like Australian energy group Woodside are using Watson’s language capabilities as a form of advanced search engine to trawl their internal “knowledge bases”. After feeding more than 20,000 documents from 30 years of projects into the system, the company’s engineers can now use it to draw on past expertise, like calculating the maximum pressure that can be used in a particular pipeline.
    To critics in the AI world, the new, componentised Watson has little to do with the original breakthrough and waters down the technology. “It feels like they’re putting a lot of things under the Watson brand name — but it isn’t Watson,” says Mr Hammond.
    Mr Etzioni goes further, claiming that IBM has done nothing to show that its original Jeopardy!-playing breakthrough can yield results in the real world. “We have no evidence that IBM is able to take that narrow success and replicate it in broader settings,” he says. Of the box of tricks that is now sold under the Watson name, he adds: “I’m not aware of a single, super-exciting app.”

    To IBM, though, such complaints are beside the point. “Everything we brand Watson analytics is very high-end AI,” says Mr Kelly, involving “machine learning and high-speed unstructured data”. Five years after Jeopardy! the system has evolved far beyond its original set of tricks, adding capabilities such as image recognition to expand greatly the range of real-world information it can consume and process.


    Adopting the system
    This argument may not matter much if the Watson brand lives up to its promise. It could be self-fulfilling if a number of early customers adopt the technology and put in the work to train the system to work in their industries, something that would progressively extend its capabilities.

    Another challenge for early users of Watson has been knowing how much trust to put in the answers the system produces. Its probabilistic approach makes it very human-like, says Ms Chin at MD Anderson. Having been trained by experts, it tends to make the kind of judgments that a human would, with the biases that implies.
    In the business world, a brilliant machine that throws out an answer to a problem but cannot explain itself will be of little use, says Mr Hammond. “If you walk into a CEO’s office and say we need to shut down three factories and sack people, the first thing the CEO will say is: ‘Why?’” He adds: “Just producing a result isn’t enough.”
    IBM’s attempts to make the system more transparent, for instance by using a visualisation tool called WatsonPaths to give a sense of how it reached a conclusion, have not gone far enough, he adds.
    Mr Kelly says a full audit trail of Watson’s decision-making is embedded in the system, even if it takes a sophisticated user to understand it. “We can go back and figure out what data points Watson connected” to reach its answer, he says.

    He also contrasts IBM with other technology companies like Google and Facebook, which are using AI to enhance their own services or make their advertising systems more effective. IBM is alone in trying to make the technology more transparent to the business world, he argues: “We’re probably the only ones to open up the black box.”
    Even after the frustrations of wrestling with Watson, customers like MD Anderson still believe it is better to be in at the beginning of a new technology.
    “I am still convinced that the capability can be developed to what we thought,” says Ms Chin. Using the technology to put the reasoning capabilities of the world’s oncology experts into the hands of other doctors could be far-reaching: “The way Amazon did for retail and shopping, it will change what care delivery looks like.”
    Ms Chin adds that Watson will not be the only reasoning engine that is deployed in the transformation of healthcare information. Other technologies will be needed to complement it, she says.
    Five years after Watson’s game show gimmick, IBM has finally succeeded in stirring up hopes of an AI revolution in business. Now, it just has to live up to the promises.

    Source: Financial Times

  • Big Data Predictions for 2016

    A roundup of big data and analytics predictions and pontifications from several industry prognosticators.

    At the end of each year, PR folks from different companies in the analytics industry send me predictions from their executives on what the next year holds. This year, I received a total of 60 predictions from a record 17 companies. I can't laundry-list them all, but I can and did put them in a spreadsheet (irony acknowledged) to determine the broad categories many of them fall in. And the bigger of those categories provide a nice structure to discuss many of the predictions in the batch.

    Predictions streaming in
    MapR CEO John Schroeder, whose company just added its own MapR Streams component to its Hadoop distribution, says "Converged Approaches [will] Become Mainstream" in 2016. By "converged," Schroeder is alluding to the simultaneous use of operational and analytical technologies. He explains that "this convergence speeds the 'data to action' cycle for organizations and removes the time lag between analytics and business impact."

    The so-called "Lambda Architecture" focuses on this same combination of transactional and analytical processing, though MapR would likely point out that a "converged" architecture co-locates the technologies and avoids Lambda's approach of tying the separate technologies together.
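
    For readers unfamiliar with the Lambda Architecture mentioned above, a minimal sketch of its core idea: merge a precomputed batch view with a real-time speed layer at query time. Simple event counting stands in for real analytics here:

    ```python
    # Toy Lambda-style merge of a batch view and a speed layer.
    from collections import Counter

    batch_view = Counter({"sensor-a": 10_000, "sensor-b": 7_500})  # nightly batch output
    speed_layer = Counter()                                         # updated per event

    def ingest(event_key: str) -> None:
        speed_layer[event_key] += 1                 # real-time path

    def query(key: str) -> int:
        return batch_view[key] + speed_layer[key]   # merge both layers at query time

    ingest("sensor-a")
    ingest("sensor-a")
    print(query("sensor-a"))   # 10002: batch result plus fresh events
    ```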

    Whether integrated or converged, Phu Hoang, the CEO of DataTorrent, predicts 2016 will bring an ROI focus to streaming technologies, which he summarizes as "greater enterprise adoption of streaming analytics with quantified results." Hoang explains that "while lots of companies have already accepted that real-time streaming is valuable, we'll see users looking to take it one step further to quantify their streaming use cases."

    Which industries will take charge here? Hoang says "FinTech, AdTech and Telco lead the way in streaming analytics." That makes sense, but I think heavy industry is, and will be, in a leadership position here as well.

    In fact, some in the industry believe that just about everyone will formulate a streaming data strategy next year. One of those is Anand Venugopal of Impetus Technologies, whom I spoke with earlier this month. Venugopal feels that we are within two years of streaming data being looked upon as just another data source.

    Internet of predicted things
    It probably won't shock you that the Internet of Things (IoT) was a big theme in this year's round of predictions. Quentin Gallivan, Pentaho's CEO, frames the thoughts nicely with this observation: "Internet of Things is getting real!" Adam Wray, CEO at Basho, quips that "organizations will be seeking database solutions that are optimized for the different types of IoT data." That might sound a bit self-serving, but Wray justifies this by reasoning that this will be driven by the need to "make managing the mix of data types less operationally complex." That sounds fair to me.

    Snehal Antani, CTO at Splunk, predicts that "Industrial IoT will fundamentally disrupt the asset intelligence industry." Suresh Vasudevan, the CEO of Nimble Storage, proclaims "in 2016 the IoT invades the datacenter." That may be, but IoT technologies are far from standardized, and that's a barrier to entry for the datacenter. Maybe that's why the folks at DataArt say "the IoT industry will [see] a year of competition, as platforms strive for supremacy." Maybe the data center invasion will come in 2017, then.

    Otto Berkes, CTO at CA Technologies, asserts that "Bitcoin-born Blockchain shows it can be the storage of choice for sensors and IoT." I hardly fancy myself an expert on blockchain technology, so I asked CA for a little more explanation around this one. A gracious reply came back, explaining that "IoT devices using this approach can transact directly and securely with each other...such a peer-to-peer configuration can eliminate potential bottlenecks and vulnerabilities." That helped a bit, and it incidentally shines a light on just how early-stage IoT technology still is, with respect to security and distributed processing efficiencies.

    Growing up
    Though admittedly broad, the category with the most predictions centered on the theme of value and maturity in Big Data products supplanting the fascination with new features and products. Essentially, value and maturity are proxies for the enterprise-readiness of Big Data platforms.

    Pentaho's Gallivan says that "the cool stuff is getting ready for prime time." MapR's Schroeder predicts "Shiny Object Syndrome Gives Way to Increased Focus on Fundamental Value," and qualifies that by saying "...companies will increasingly recognize the attraction of software that results in business impact, rather than focusing on raw big data technologies." In a related item, Schroeder predicts "Markets Experience a Flight to Quality," further stating that "...investors and organizations will turn away from volatile companies that have frequently pivoted in their business models."

    Sean Ma, Trifacta's Director of Product Management, looking at the manageability and tooling side of maturity, predicts that "Increasing the amount of deployments will force vendors to focus their efforts on building and marketing management tools." He adds: "Much of the capabilities in these tools...will need to replicate functionality in analogous tools from the enterprise data warehouse space, specifically in the metadata management and workflow orchestration." That's a pretty bold prediction, and Ma's confidence in it may indicate that Trifacta has something planned in this space. But even if not, he's absolutely right that this functionality is needed in the Big Data world. In terms of manageability, Big Data tooling needs to achieve not just parity with data warehousing and BI tools, but needs to surpass that level.

    The folks at Signals say "Technology is Rising to the Occasion" and explain that "advances in artificial intelligence and an understanding [of] how people work with data is easing the collaboration between humans and machines necessary to find meaning in big data." I'm not sure if that is a prediction, or just wishful thinking, but it certainly is the way things ought to be. With all the advances we've made in analyzing data using machine learning and intelligence, we've left sifting through the output a largely manual process.

    Finally, Mike Maciag, the COO at AltiScale, asserts this forward-looking headline: "Industry standards for Hadoop solidify." Maciag backs up his assertion by pointing to the Open Data Platform initiative (ODPi) and its work to standardize Hadoop distributions across vendors. ODPi was originally anchored by Hortonworks, with numerous other companies, including AltiScale, IBM and Pivotal, jumping on board. The organization is now managed under the auspices of the Linux Foundation.

    Artificial flavor
    Artificial Intelligence (AI) and Machine Learning (ML) figured prominently in this year's predictions as well. Splunk's Antani reasons that "Machine learning will drastically reduce the time spent analyzing and escalating events among organizations." But Lukas Biewald, Founder and CEO of Crowdflower insists that "machines will automate parts of jobs -- not entire jobs." These two predictions are not actually contradictory. I offer both of them, though, to point out that AI can be a tool without being a threat.

    Be that as it may, Biewald also asserts that "AI will significantly change the business models of companies today." He expands on this by saying "legacy companies that aren't very profitable and possess large data sets may become more valuable and attractive acquisition targets than ever." In other words, if companies found gold in their patent portfolios previously, they may find more in their data sets, as other companies acquire them to further their efforts in AI, ML and predictive modeling.

    And more
    These four categories were the biggest among all the predictions but not the only ones, to be sure. Predictions around cloud, self-service, flash storage and the increasing prominence of the Chief Data Officer were in the mix as well. A number of predictions that stood on their own were there too, speaking to issues ranging from salaries for Hadoop admins to open source, open data and container technology.

    What's clear from almost all the predictions, though, is that the market is starting to take basic big data technology as a given, and is looking towards next-generation integration, functionality, intelligence, manageability and stability. This implies that customers will demand certain baseline data and analytics functionality to be part of most technology solutions going forward. And that's a great sign for everyone involved in Big Data.

    Source: ZDNet


  • Big data vendors see the internet of things (IoT) opportunity, pivot tech and message to compete

    Open source big data technologies like Hadoop have done much to begin the transformation of analytics. We're moving from expensive and specialist analytics teams towards an environment in which processes, workflows, and decision-making throughout an organisation can - in theory at least - become usefully data-driven. Established providers of analytics, BI and data warehouse technologies liberally sprinkle Hadoop, Spark and other cool project names throughout their products, delivering real advantages and real cost-savings, as well as grabbing some of the Hadoop glow for themselves. Startups, often closely associated with shepherding one of the newer open source projects, also compete for mindshare and custom.

    And the opportunity is big. Hortonworks, for example, has described the global big data market as a $50 billion opportunity. But that pales into insignificance next to what Hortonworks (again) describes as a $1.7 trillion opportunity. Other companies and analysts have their own numbers, which do differ, but the step-change is clear and significant. Hadoop, and the vendors gravitating to that community, mostly address 'data at rest': data that has already been collected from some process, interaction or query. The bigger opportunity relates to 'data in motion,' and to the internet of things that will be responsible for generating so much of this.

    My latest report, Streaming Data From The Internet Of Things Will Be The Big Data World’s Bigger Second Act, explores some of the ways that big data vendors are acquiring new skills and new stories with which to chase this new opportunity.

    For CIOs embarking on their IoT journey, it may be time to take a fresh look at companies previously so easily dismissed as just 'doing the Hadoop thing.' 

    Source: Forrester.com

  • Gartner: shortage of experts to secure the Internet of Things

    The Internet of Things brings all kinds of security risks with it, but experienced experts who can help with these are scarce, says market researcher Gartner. According to Gartner, security technologies are required to protect all Internet of Things devices against attacks.

    These include both known and new attacks. Gartner points to attacks in which attackers impersonate particular devices, or carry out "denial-of-sleep" attacks to drain devices' batteries. Securing the Internet of Things is further complicated by the fact that many of the devices use simple processors and operating systems that do not support complex security solutions. There is also a shortage of expertise.

    "Ervaren IoT-beveiligingsspecialisten zijn schaars, en beveiligingsoplossingen zijn op het moment gefragmenteerd en bestaan uit verschillende leveranciers", zegt Nick Jones, vicepresident en analist bij Gartner. Jones voorspelt dat er de komende jaren nieuwe dreigingen voor het Internet of Things zullen verschijnen, aangezien hackers nieuwe manieren vinden om aangesloten apparaten en protocollen aan te vallen. Dit zou inhouden dat veel van de IoT-apparaten gedurende hun levenscyclus van hardware- en softwareupdates moeten worden voorzien.

    Source: Security.nl

  • How big is ‘the next big thing’?

    What if IoT were simply an umbrella term for ways to make something useful out of machine-generated data? For example, a bus tells my phone how far away my bus stop is, and my bike-share service tells me how many bikes are available.

    In 2014, IDC asked 400 C-suite professionals what they thought IoT was. The answers ranged from types of devices (thermostats, cars, home security systems) to challenges (security, data management, connectivity). The same analyst firm also stresses that the global market for IoT solutions will grow from 1.9 trillion dollars in 2013 to 7.1 trillion dollars in 2020. This optimism is supported by Gartner's estimate that 4.9 billion connected 'things' will be in use in 2016, rising to 25 billion by 2020.
    In other words: IoT is highly diverse and its potential is enormous. The value lies not only in the cost of the sensors. It is much more than that.

    When IoT starts talking
    The IoT is not something that stands on its own. It is maturing alongside big data. Equipping billions of objects with sensors is of limited value if it is not possible to generate, transmit, store and analyse billions of data streams.
    The data scientist is the human choreographer of this IoT. They are essential for identifying the value of the enormous amounts of data all these devices generate. And that is why connectivity and storage are so important. Small, isolated devices with no storage and little computing power tell us little. Only by looking at large collections of data can we discover correlations, recognise trends and make predictions.
    In any business environment the scenario is identical: the CxO will compare the information available today with information from the past to gain predictive insight into what will happen in the future.

    Faster insight leads to competitive advantage
    Today's CxOs want a different kind of company. They want it to operate at a fast pace and respond to the market, but they also want to make decisions based on intelligence gathered through big data. And they want to build the best products, based on customer insight. Companies are looking for a disruptive business model that lets them respond ever more quickly to market trends and so stay ahead of the competition.

    Start-up behaviour
    The answer lies in the following question for companies: "Why can't enterprises behave more like start-ups?" This is not about making hasty decisions with little or no oversight. It is about adopting a lean business model that tolerates uncertainty and stretched budgets. More importantly, it is about how the company's management establishes a culture of decisiveness.
    The organisations that will win the big data game are not the ones with the most or the best access to it. The winners clearly define their goals, set the necessary operational boundaries and determine what equipment is needed to get the job done.

    CIOs in a leading role
    CxOs have recognised the business value of IT, and want CIOs to take more of a leading role and map out the future of the company. IT can play an enormous role in building that future by working together with the business and providing the tools needed to be productive. Technology can facilitate continuous innovation at every level, allowing the company not just to survive but to flourish.
    Fulfilling this wish of businesses is no small feat. But partnering with technology makes it far more achievable, because it enables companies to reach an agile, innovative, data-driven future.

    Source: ManagersOnline

  • How blockchain can change the future of IoT

    The Internet of Things (IoT) is a fast-growing industry destined to transform homes, cities, farms, factories, and practically everything else by making them smart and more efficient. According to Gartner, by 2020, there will be more than 20 billion connected things across the globe, powering a market that will be worth north of $3 trillion.
     
    But the chaotic growth of IoT will introduce several challenges, including identifying, connecting, securing, and managing so many devices. It will be very challenging for the current infrastructure and architecture underlying the Internet and online services to support huge IoT ecosystems of the future.
     
    This is something that can perhaps be solved through blockchain, the distributed ledger technology behind cryptocurrencies such as Bitcoin and Ethereum, which is proving its worth in many other industries, including IoT.
     
    Blockchain will enable IoT ecosystems to break from the traditional broker-based networking paradigm, where devices rely on a central cloud server to identify and authenticate individual devices.
     
    The blockchain security model
    While the centralized model has worked perfectly in the past decades, it will become problematic when the number of network nodes grows into the millions, generating billions of transactions, because it will exponentially increase computational requirements — and by extension the costs.
     
    The servers can also become a bottleneck and a single point of failure, which will make IoT networks vulnerable to Denial of Service (DoS/DDoS) attacks, where servers are targeted and brought down by being flooded with traffic from compromised devices.
     
    This can critically impact IoT ecosystems, especially as they take on more sensitive tasks.
     
    Moreover, centralized networks will be difficult to establish in many industrial settings such as large farms, where IoT nodes will expand over wide areas with scarce connectivity gear.
     
    Blockchain technology will enable the creation of secure mesh networks, where IoT devices will interconnect in a reliable way while avoiding threats such as device spoofing and impersonation.
     
    With every legitimate node being registered on the blockchain, devices will easily be able to identify and authenticate each other without the need for central brokers or certification authorities, and the network will be scalable to support billions of devices without the need for additional resources.
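
    A toy sketch of that registration idea: device identities appended to a hash-chained ledger that peers consult instead of a central broker. A real blockchain adds distribution and consensus; this only illustrates the data model:

    ```python
    # Toy hash-chained ledger of device identities (not a real blockchain).
    import hashlib, json

    ledger = []   # list of blocks, each committing to the previous one

    def register(device_id: str, public_key: str) -> None:
        prev = ledger[-1]["hash"] if ledger else "0" * 64
        body = json.dumps({"device": device_id, "key": public_key, "prev": prev})
        ledger.append({"body": body, "hash": hashlib.sha256(body.encode()).hexdigest()})

    def is_registered(device_id: str) -> bool:
        # Peers can authenticate each other against the shared ledger.
        return any(json.loads(b["body"])["device"] == device_id for b in ledger)

    register("meter-001", "pubkey-abc")
    print(is_registered("meter-001"), is_registered("rogue-007"))   # True False
    ```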
     
    Several companies are already putting blockchain to use to power IoT networks. One example is Filament, a startup that provides IoT hardware and software for industrial applications such as agriculture, manufacturing, and oil and gas industries.
     
    Filament’s wireless sensors, called Taps, create low-power autonomous mesh networks that enable enterprise companies to manage physical mining operations or water flows over agricultural fields without relying on centralized cloud alternatives. Device identification and intercommunication is secured by a bitcoin blockchain that holds the unique identity of each participating node in the network.
     
    Australian telecommunication giant Telstra is another company leveraging blockchain technology to secure smart home IoT ecosystems. Cryptographic hashes of device firmware are stored on a private blockchain to minimize verification time and obtain real-time tamper resistance and tamper detection.
     
    Since most smart home devices are controlled through mobile apps, Telstra further expands the model and adds user biometric information to the blockchain hashes in order to tie in user identity and prevent compromised mobile devices from taking over the network. This way, the blockchain will be able to verify both the identity of IoT devices and the identity of the people interacting with those devices.
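
    The firmware tamper-detection idea can be illustrated in a few lines: store a hash of known-good firmware, then re-hash what a device is actually running and compare. A plain dict stands in here for the private blockchain that would hold the reference hashes:

    ```python
    # Sketch of firmware integrity checking against stored reference hashes.
    import hashlib

    trusted_hashes = {}   # stand-in for hashes anchored on a private blockchain

    def enroll(device: str, firmware: bytes) -> None:
        trusted_hashes[device] = hashlib.sha256(firmware).hexdigest()

    def verify(device: str, firmware: bytes) -> bool:
        return trusted_hashes.get(device) == hashlib.sha256(firmware).hexdigest()

    enroll("doorlock-7", b"firmware v1.2 image bytes")
    print(verify("doorlock-7", b"firmware v1.2 image bytes"))   # True
    print(verify("doorlock-7", b"firmware v1.2 image bytes!"))  # False: tampered
    ```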
     
    Catering to the future of IoT
    While still in its early development stages, IoT is mostly comprised of technologies that allow for data collection, remote monitoring, and control of devices. As we move forward, IoT will transition toward becoming a network of autonomous devices that can interact with each other and with their environment and make smart decisions without human intervention.
     
    This is where blockchain can shine and form the basis that will support a shared economy based on machine-to-machine (M2M) communications.
     
    We’re already seeing initiatives emerging in this field, including ADEPT (Automated Decentralized P2P Telemetry), a decentralized IoT system created by IBM and Samsung, which enables billions of devices to broadcast transactions between peers and perform self-maintenance.
     
    The platform has been tested in several scenarios, including one that involves a smart washing machine that can automatically order and pay for detergent with bitcoins or ethers when it runs out and will be able to negotiate for the best deal through smart contracts based on its owner’s preferences.
     
    As the backbone of all of these interactions, blockchain creates a secure and democratized platform that is independent and levels the field for all involved parties, making sure everyone plays fair and no single entity is in control.
     
    Blockchain will also enable data monetization, where owners of IoT devices and sensors can share the generated IoT data in exchange for real-time micropayments. Tilepay, for example, offers a secure, decentralized online marketplace where users can register their devices on the blockchain and sell their data in real-time in exchange for digital currency.
     
    Blockchain and IoT also have interesting use cases that can help make renewable energy sources mainstream, where energy produced by IoT solar panels generates cryptocurrency value that is registered on the blockchain. Anyone joining the network can make investments in renewable energy technology. Organizations such as Nasdaq and Chain of Things, a think tank that conducts research on alternative applications for blockchain and IoT, are exploring this field.
     
    Blockchain presents many promises for the future of IoT. Challenges still remain, such as consensus models and the computational costs of verifying transactions. But we are still in the early stages of blockchain development, and these hurdles will eventually be overcome, opening the path for many exciting possibilities.
     
    Source: venturebeat.com, November 20, 2016
  • How to Optimize Analytics for Growing Data Stores

    Every minute of every day, mind-blowing amounts of data are generated. Twitter users send 347,222 tweets, YouTube users upload 300 hours of video, and Google receives more than four million search queries. And in a single hour, Walmart processes more than a million customer transactions. With the Internet of Things accelerating at lightning speed – to the tune of 6.4 billion connected devices in 2016 (up 30 percent from 2015) – this already staggering amount of data is about to explode. By 2020, IDC estimates there will be 40 zettabytes of data. That’s 5,200 GB for every person on the planet.

    This data is a gold mine for businesses. Or, at least, it can be. On its own, data has zero value. To turn it into a valuable asset, one that delivers the actionable intelligence needed to transform business, you need to know how to apply analytics to that treasure trove. To set yourself up for success, start out by answering these questions:

    What Is the Size, Volume, Type and Velocity of Your Data?

    The answers to this will help you determine the best kind of database to store your data and fuel your analysis. For instance, some databases handle structured data, and others are focused on semi-structured or unstructured data. Some are better with high-velocity and high-volume data.

    • RDBMS (e.g. DB2, Oracle, MySQL): structured data; rich features, ACID compliant, scaling issues
    • Adaptive (e.g. Deep Information Sciences): structured data; fast read/write, strong scale, ACID, flexible
    • NoSQL (e.g. Cloudera, MongoDB, Cassandra): un/semi-structured data; fast ingest, not ACID compliant
    • Specialty (graphing, column store, time series): multiple data types; good for reads, not writes; ETL delays
    • In-Memory (e.g. MemSQL, VoltDB): structured data; fast, less scale; ETL delays for analytics
    • NewSQL (e.g. NuoDB): structured data; good scale and replication, high overhead
    • Distributed (e.g. Hadoop): structured data; distributed, document-based database, slow batch-based queries

    Which Analytics Use Cases Will You Be Supporting?

    The type of use cases will drive the business intelligence capabilities you’ll require (Figure 1).

    • Analyst-driven BI. Operator seeking insights across a range of business data to find cross-group efficiencies, profit leakage, cost challenges, etc.
    • Workgroup-driven BI. Small teams focused on a sub-section of the overall strategy and reporting on KPIs for specific tasks.
    • Strategy-driven BI. Insights mapped against a particular strategy with the dashboard becoming the “single source of truth” for business performance.
    • Process-driven BI. Business automation and workflow built as an autonomic process based on outside events.

    [Figure 1]

    Where Do You Want Your Data and Analytics to Live?

    The main choices are on-premises or in the cloud. Until recently, for many companies – particularly those concerned about security – on-prem won out. However, that’s changing significantly as cloud-based solutions have proven to be solidly secure. In fact, a recent survey found that 40 percent of big data practitioners use cloud services for analytics and that number is growing.

    The cloud is attractive for many reasons. The biggest is fast time-to-impact. With cloud-based services you can get up and running immediately. This means you can accelerate insights, actions, and business outcomes. There’s no waiting three to four months for deployment and no risk of development issues.

    There’s also no need to purchase and install infrastructure. This is particularly critical for companies that don’t have the financial resources or skills to set up and maintain database and analytics environments on-premises. Without cloud, these companies would be unable to do the kind of analyses required to thrive in our on-demand economy. However, even companies that do have the resources benefit by freeing up people and budget for more strategic projects.

    With data and analytics in the cloud, collaboration also becomes much easier. Your employees, partners, and customers can instantly access business intelligence and performance management.

    Cloud Options

    There are a number of cloud options you can employ. Here’s a quick look at them:

    Infrastructure as a Service (IaaS) for generalized compute, network, and storage clusters. IaaS is great for flexibility and scale, and will support any software. You will be required to install and manage the software.

    Database as a Service (DBaaS), where multi-tenant or dedicated database instances are hosted by the service provider. DBaaS also is great for flexibility and scale, and it offloads backups and data management to the provider. Your data is locked into the provider’s database solution.

    Analytics as a Service (AaaS) provides complex analytics engines that are ready for use and scale as needed, with pre-canned reports.

    Platform as a Service (PaaS) is similar to DBaaS in that it scales easily and that application backups and data management are handled by the provider. Data solutions themselves are often add-ons.

    Software as a Service (SaaS) is when back office software is abstracted through a hosted application with data made available through APIs. Remote analytics are performed “over the wire” and can be limiting.

    How you leverage data can make or break your business. If you decide to go the cloud route, make sure your service provider’s database and analytics applications fit your current and evolving needs. Make sure the provider has the expertise, infrastructure, and proven ability to handle data ebbs and flows in a way that’s cost-effective for you and, equally important, ensures that your performance won’t be compromised when the data tsunami hits. Your business depends on it.

     Source: DataInformed

  • NoSQL and the Internet of Things

    many-sensorsInternet of Things technology is a hot topic. You can’t read a tech news site without coming across at least one mention of IoT. But if you’re looking to take advantage of sensors, you will likely have to update your data store to handle the workload. Once you’re set up data-wise, get ready to monitor everything from weather and the environment to overseas factory floors and even fleets of trucks.

    Why NoSQL for IoT?

    You might think your data needs for sensors are as tiny as these little devices, but there are several reasons you should consider a NoSQL database.

    The first reason is that these sensors can send huge amounts of data since they run 24/7. All of that data adds up to the need for a larger storage capacity. While you might be tempted to use an RDBMS, relational databases were never really meant to deal with the kind of data that sensors generate. For one thing, sensor data doesn’t always make sense in tabular format.

    SQL was originally designed for relatively static data structured as a table. Data from sensors can change a lot and provides a continuous stream. And you need to be able to add or remove entries on the fly, which can prove difficult with relational databases.

    NoSQL databases are also more scalable, offering flexibility in data models. You can have a structure similar to SQL with wide tables, or you might choose to go with a document-oriented database, key-value database, or graph database. Time series databases are one of the more obvious choices for Internet of Things applications specifically.
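
    As a small illustration of that flexibility argument, a sketch of sensor readings stored as schema-free documents; a plain list stands in for a document store such as MongoDB, whose insert calls would look similar:

    ```python
    # Heterogeneous sensor readings as documents: no fixed table schema needed.
    from datetime import datetime, timezone

    readings = []   # stand-in for a document collection

    def record(sensor_id: str, **measurements) -> None:
        readings.append({
            "sensor": sensor_id,
            "ts": datetime.now(timezone.utc).isoformat(),
            **measurements,      # fields can differ per device, no migration
        })

    record("weather-17", temp_c=3.4, humidity=0.81)
    record("truck-042", lat=52.37, lon=4.89, fuel_pct=63)   # different shape, same store
    print(readings)
    ```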

    Some businesses may join the big data revolution without knowing where they are actually going to store their data. You could have a cluster dedicated to your data and another to your analytics, but that’s expensive. Wouldn’t it be great if you could have your data and analytics in the same cluster? NoSQL eliminates budget waste for those with two different clusters that amount to the same thing.

    Applications

    So now that you’ve got your IoT-capable database, what can you do with Internet of Things technology?

    In “The Only Living Boy in New York,” Paul Simon famously got all the news he needed on the weather report. While the weather might seem to most of us like no more than a cliche, “safe” topic for conversation, for many people, receiving the weather report is a matter of safety and survival.

    The severe weather that has impacted much of the U.S. in recent years shows how timely weather forecasts can save lives by allowing forecasters to give accurate, quick, and up-to-the-moment warnings and alerts. Both the National Weather Service and private forecasters use sophisticated models to predict the weather, and those models get better all the time. One of the primary reasons they continue to improve is that the forecasters feed the programs with real data gathered from weather stations around the world. The Weather Channel acquired Weather Underground largely for its extensive network of weather stations operated by enthusiasts.

    If you’re not interested in weather monitoring, IoT offers other options, such as monitoring pollution. Sensors can measure particles in the air, or chemicals and bacteria in the water. Agencies could use this information to plan congestion pricing for commuters or direct cleanup resources.

    Nearly every state in the U.S. has called for more manufacturers to bring jobs back to the U.S. instead of offshoring, but manufacturers cite high costs as a reason to keep factory jobs overseas. One way to monitor operations abroad is to deploy IoT on factory floors. While automated process control is nothing new, what is new is the ability to connect directly to factory floors from around the world. Businesses can monitor production and instantly track problems before they become big ones.

    One of the biggest successes for the Internet of Things industry is in logistics, particularly fleet tracking. Trucking companies can see instantly where their vehicles are, and customers can know exactly where their stuff is. Managers can even track fuel usage and see when trucks are due for maintenance. All of these factors help logistics companies cut costs, save fuel, and keep customers.

    Conclusion

    NoSQL may be just the solution you need to venture into IoT technology. With the ability to handle the vast workloads from sensors running 24/7, you’ll be able to react to new situations quickly. NoSQL can help you save money, save time, and even save lives. 

    Source: Smartdatacollective

  • Security Concerns Grow As Big Data Moves to Cloud

    Despite exponential increases in data storage in the cloud along with databases and the emerging Internet of Things (IoT), IT security executives remain worried about security breaches as well as vulnerabilities introduced via shared infrastructure.

    A cloud security survey released Wednesday (Feb. 24) by enterprise data security vendor Vormetric and 451 Research found that 85 percent of respondents use sensitive data stored in the cloud, up from 54 percent last year. Meanwhile, half of those surveyed said they are using sensitive data within big data deployments, up from 31 percent last year. One-third of respondents said they are accessing sensitive data via IoT deployments.

    The upshot is that well over half of those IT executives surveyed are worried about data security as cloud usage grows, citing the possibility of attacks on service providers, exposure to vulnerabilities on shared public cloud infrastructure and a lack of control over where data is stored.

    Those fears are well founded, the security survey notes: “To a large extent both security vendors and enterprises are like generals fighting the last war. While the storm of data breaches continues to crest, many remain focused on traditional defenses like network and endpoint security that are clearly no longer sufficient on their own to respond to new security challenges.”

    Control and management of encryption keys is widely seen as critical to securing data stored in the cloud, the survey found. IT executives were divided on the question of managing encryption keys, with roughly half previously saying that keys should be managed by cloud service providers. That view has shifted in the past year, the survey found, with 65 percent now favoring on-premise management of encryption keys.

    In response to security concerns, public cloud vendors like Amazon Web Services, Google, Microsoft and Salesforce have moved to tighten data security through internal development, partnerships and acquisitions in an attempt to reduce vulnerabilities. Big data vendors have lagged behind, but the survey noted that acquisitions by Cloudera and Hortonworks represent concrete steps toward securing big data.

    Cloudera acquired encryption and key management developer Gazzang in 2014 to boost Hadoop security. Among Hortonworks’ recent acquisitions is XA Secure, a developer of security tools for Hadoop.

    Still, the survey warned, IoT security remains problematic.

    When asked which data resources were most at risk, 54 percent of respondents to the Vormetric survey cited databases, while 41 percent said file servers. Indeed, when linked to the open Internet, these machines can be exposed to vulnerabilities similar to the recent “man-in-the-middle” attacks on an open source library.
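
    A first line of defense against such man-in-the-middle exposure is to refuse unverified TLS connections to Internet-facing machines. A minimal sketch using Python's standard library; the hostname is a placeholder, since the article names no specific server:

        import socket
        import ssl

        context = ssl.create_default_context()            # verifies certificate chain and hostname
        context.minimum_version = ssl.TLSVersion.TLSv1_2

        # "db.example.com" stands in for an Internet-facing database or file server.
        with socket.create_connection(("db.example.com", 443)) as sock:
            with context.wrap_socket(sock, server_hostname="db.example.com") as tls:
                print("Negotiated:", tls.version())       # connection fails if the cert is forged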

    (Security specialist SentinelOne released an endpoint platform this week designed to protect enterprise datacenters and cloud providers from emerging threats that target Linux servers.)

    Meanwhile, the top security concerns for big data implementations were: the security of reports that include sensitive information; sensitive data spread across big data deployments; and privacy violations related to data originating in multiple countries. Privacy worries have been complicated by delays in replacing the 15-year-old “safe harbor” agreement governing trans-Atlantic data transfers, which was struck down last year. A proposed E.U.-U.S. Privacy Shield deal has yet to be implemented.

    Despite these uncertainties and continuing security worries, respondents said they would continue shifting more sensitive data to the cloud, databases and IoT implementations as they move computing resources closer to data. For example, half of all survey respondents said they would store sensitive information in big data environments.

    Source: Datanami

  • The mobile revolution is over. Get ready for the next big thing: Robots

    The computer industry moves in waves. We're at the tail end of one of those waves: the mobile revolution. What's next? Robots.

    But not the way you think.

    The robot revolution won't be characterized by white plastic desk lamps following you around asking questions in a creepy little-girl voice, like I saw at last week's Consumer Electronics Show in Las Vegas. That might be a part of it, but a small part. Rather, it'll be characterized by dozens of devices working on your behalf, invisibly, all the time, to make your life more convenient.

    Some people in the industry use the term "artificial intelligence" or "digital assistants." Others talk about "smart" devices. But none of these terms capture how widespread and groundbreaking this revolution will be. This isn't just about a coffee maker that knows to turn itself on when your alarm goes off, or a thermostat that adjusts to your presence.

    (And "Internet of Things" — please stop already.)

    This is about every piece of technology in your life working together to serve you. Robots everywhere, all the time. Not like the Roomba. More like the movie "Her."

    Where've we been?

    Every 10 or 15 years, a convergence of favorable economics and technical advances kicks off a revolution in computing. Mainstream culture changes dramatically. New habits are formed. Multibillion-dollar companies are created. Companies and entire industries are disrupted and die. I've lived through three of these revolutions.

    • The PC revolution. This kicked off in the 1980s with the early Apple computers and the IBM PC that quickly followed, along with the PC clones. Microsoft and Intel were the biggest winners. IBM was most prominent among the big losers, but there were many others: basically, any company that thought computing would remain exclusively in the hands of a few huge computers stored in a data center somewhere. By the end, Microsoft's audacious dream of "a computer on every desk and in every home" was real.

    • The internet revolution. This kicked off in the mid 1990s with the standardization of various internet protocols, followed by the browser war and the dot-com boom and bust. Amazon and Google were the biggest winners. Industries that relied on physical media and a distribution monopoly, like recorded music and print media, were the biggest losers. By the end, everybody was online and the idea of a business not having a website was absurd.

    • The mobile revolution. This kicked off in 2007 with the launch of the iPhone. Apple and Samsung were the biggest winners. Microsoft was among the big losers, as its 20-year monopoly on personal computing finally broke.

    A couple of important points:

    First, when a revolution ends, that doesn't mean the revolutionary technology goes away. Everybody still has a PC. Everybody still uses the internet. It simply means that the technology is so common and widespread that it's no longer revolutionary. It's taken for granted.

    So: The mobile revolution is over.

    More than a billion smartphones ship every year. Apple will probably sell fewer iPhones this year than last year for the first time since the product came out. Huge new businesses have already been built on the idea that everybody will have an internet-connected computer in their pocket at all times — Uber wouldn't make sense without a smartphone, and Facebook could easily have become a historical curiosity like MySpace if it hadn't jumped into mobile so adeptly. This doesn't mean that smartphones are going away, or that Apple is doomed, or any of that nonsense. But the smartphone is normal now. Even boring. It's not revolutionary.

    The second thing to note is that each revolution decentralized power and distributed it to the individual.

    The PC brought computing power out of the bowels of the company and onto each desk and into each home. The internet took reams of information that had been locked up in libraries, private databases, and proprietary formats (like compact discs) and made it available to anybody with a computer and a phone line.

    The smartphone took those two things and put them in our pockets and purses.

    Tomorrow and how we get there

    This year's CES seemed like an "in-betweener." Everybody was looking for the next big thing. Nothing really exciting dominated the show.

    There were smart cars, smart homes, drones, virtual reality, wearable devices to track athletic performance, smart beds, smart luggage (really), and, yeah, weird little robots with anime faces and little-girl voices.

    But if you look at what all these things have in common, plus what the big tech companies are investing in right now, a picture starts to emerge.

    • Sensors and other components are dirt cheap. Thanks to the mobile revolution creating massive scale for the components that go into phones and tablets, sensors of every imaginable kind (GPS, motion trackers, cameras, microphones) are unimaginably cheap. So are the parts for sending bits of information over various wireless connections (Bluetooth LE, Wi-Fi, LTE, whatever). These components will continue to get cheaper. This paves the way for previously inanimate objects to collect every imaginable kind of data and send simple signals to one another; see the sketch after this list.

    • Every big tech company is obsessed with AI. Every single one of the big tech companies is working on virtual assistants and other artificial intelligence. Microsoft has Cortana and a bunch of interesting behind-the-scenes projects for businesses. Google has Google Now, Apple has Siri, Amazon has Echo, even Facebook is getting into the game with its Facebook M digital assistant. IBM and other big enterprise companies are also making huge investments here, as are dozens of venture-backed startups.

    • Society is ready. This is the most important point. Think about how busy we are compared with ten or twenty years ago. People work longer hours, or stitch together multiple part-time jobs to make a living. Parenting has become an insane procession of activities and playdates. The "on-demand" economy has gone from being a silly thing only business blogs write about to a mainstream part of life in big cities, and increasingly across the country — calling an Uber isn't just for Manhattan or San Francisco any more. This is the classic situation ahead of a computing revolution — everybody needs something, but they don't know they need it yet.
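
    As a concrete illustration of cheap components sending simple signals (referenced in the first bullet above), here is a minimal sketch of a sensor publishing readings over MQTT, a lightweight messaging protocol widely used for IoT, via the paho-mqtt client; the broker address, topic, and device name are assumptions:

        # pip install "paho-mqtt<2"
        import json
        import time

        import paho.mqtt.client as mqtt

        client = mqtt.Client()
        client.connect("broker.example.com", 1883)   # assumed MQTT broker
        client.loop_start()

        # A tiny read-and-publish loop is all the "intelligence" a cheap sensor needs.
        while True:
            reading = {
                "device": "thermo-01",
                "temp_c": 21.4,          # stand-in for a real sensor read
                "ts": time.time(),
            }
            client.publish("home/livingroom/temperature", json.dumps(reading), qos=0)
            time.sleep(60)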

    So imagine this. In 10 years, you pay a couple hundred bucks for a smart personal assistant, which you install on your phone as an app. It collects a bunch of information about your actions, activities, contacts, and more, and starts learning what you want. Then it communicates with dozens of other devices and services to make your life more convenient.

    Computing moves out of your pocket and into the entire environment that surrounds you.

    Your alarm is set automatically. You don't need to make a to-do list; it's already made. Mundane phone calls, like those to the cable company and the drugstore, are made automatically for you. You don't summon an Uber; a car shows up exactly when you need it, and the driver already knows the chain of stops to make. (Eventually, there won't be a driver at all.)

    If you're hungry and in a hurry, you don't call for food — your assistant asks what you feel like for dinner or figures out you're meeting somebody and orders delivery or makes restaurant reservations. The music you like follows you not just from room to room, but from building to building. Your personal drone hovers over your shoulder, recording audio and video from any interaction you need it to (unless antidrone technology is jamming it).

    At first, only the wealthy and connected have this more automated lifestyle. "Have your assistant call my assistant." But over time, it trickles down to more people, and soon you can't remember what life was like without one. Did we really have to make lists to remember to do all this stuff ourselves?

    This sounds like science fiction, and there's still a ton of work ahead to get there. Nobody's invented the common way for all these devices to speak to each other, much less the AI that can control them and stitch them together. So this revolution is still years away. But not that far.

    If you try to draw a comparison with the mobile revolution, we're still a few years from the iPhone. We're not even in the BlackBerry days yet. We're in the Palm Pilot and flip-phone days. The basic necessary technology is there, but nobody's stitched it together yet.

    But when they do — once again — trillion-dollar companies and industries will rise and fall, habits will change, and everybody will be blown away for a few years. Then, we'll all take it for granted.

    Source: Business Insider

  • TNO: ‘Amsterdam stays accessible thanks to big data’

    Research organization TNO sees opportunities for big data and Internet of Things (IoT) technology to improve the accessibility of the Amsterdam metropolitan region. “With big data we can link multiple solutions together to make optimal use of a city's infrastructure,” says Leo Kusters, Managing Director Urbanisation at TNO.

    Within a few decades, 70 percent of the world's population will live in large cities or heavily urbanized regions. The economic and cultural success of regions such as the Randstad attracts many people, placing an ever greater load on the infrastructure of these cities. Infrastructure and mobility are therefore decisive factors in the success of metropolitan regions.

    Smart mobility

    Kusters points to the Praktijkproef Amsterdam (PPA) project, in which TNO is working with ARS Traffic & Transport Technology to reduce congestion in the Amsterdam region. “Fifteen thousand motorists are connected to this project,” says Kusters. By better informing road users about the traffic situation in the city, TNO expects the number of traffic jams in the Amsterdam region to decrease.

    Participants have an app that gives them individual advice on the best travel choices they can make. They can also use the app to report incidents and delays on the road themselves. As a result, motorists reach their destinations faster and can count on reliable travel times.

    What makes this project special, according to Kusters, is that the app also gives advice based on traffic lights that are currently red. The system then takes those lights into account to prevent bottlenecks on the road.

    TNO is running a similar project with freight traffic in Helmond. Kusters: “A motorway runs through the city of Helmond and carries a great deal of truck traffic, which puts a heavy burden on the city's environment and air quality.” In this project, TNO is experimenting with data analysis to optimize traffic flow for the trucks involved. Drivers continuously receive speed advice to improve flow through the city, so they have to stop less often and burn less fuel.
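
    The Helmond speed advice boils down to a simple calculation: given the distance to the next traffic light and the time until it turns green, recommend a speed that lets the truck arrive without stopping. A toy sketch of that idea; the bounds and numbers are illustrative, and this is not TNO's actual algorithm:

        def advisory_speed_kmh(distance_m: float, seconds_to_green: float,
                               min_kmh: float = 30.0, max_kmh: float = 80.0) -> float:
            """Speed that reaches the light just as it turns green, clamped to road limits."""
            if seconds_to_green <= 0:
                return max_kmh                              # already green: drive at the limit
            ideal = (distance_m / seconds_to_green) * 3.6   # m/s converted to km/h
            return max(min_kmh, min(max_kmh, ideal))

        # 500 m from the light, green in 40 s: advise a steady 45 km/h instead of stop-and-go.
        print(advisory_speed_kmh(500, 40))                  # -> 45.0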

    Two birds with one stone

    According to Kusters, a major opportunity for big data and IoT technology lies in combining multiple solutions to make optimal use of existing infrastructure. Big data can also contribute to savings on infrastructure maintenance, on which the Netherlands spends 6 billion euros a year.

    TNO focuses, for example, on extending the lifespan of bridges. “An essential part of the infrastructure,” says Kusters. “When bridges fail, everything grinds to a halt.” TNO uses sensors to measure hairline cracks in bridges. “That way we know precisely when a bridge needs maintenance or has to be replaced. It makes it possible to extend the bridge's lifespan in a tailor-made way: exactly on time, with a minimum of disruption to traffic.”

    The lifespan of infrastructure components is usually determined on the basis of theoretical models. Kusters: “Because reality always turns out differently, TNO is developing new measurement methods together with Rijkswaterstaat. In practice, infrastructure may be used more intensively, or less intensively, than the theoretical models estimate, and the same goes for the wear. By deploying big data, we can make accurate predictions for bridge maintenance and save costs.”
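
    The measurement-driven approach Kusters describes can be reduced to a small example: fit a trend to crack-width readings and estimate when the trend crosses a maintenance threshold. A deliberately simplified sketch with invented numbers, assuming numpy; real structural-health models are far more sophisticated:

        # pip install numpy
        import numpy as np

        # Monthly crack-width readings in mm from one bridge sensor (invented data).
        months = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
        widths = np.array([0.40, 0.42, 0.45, 0.47, 0.50, 0.52])

        slope, intercept = np.polyfit(months, widths, 1)   # linear growth trend
        THRESHOLD_MM = 0.80                                # assumed maintenance limit

        months_left = (THRESHOLD_MM - widths[-1]) / slope
        print(f"Crack grows {slope:.3f} mm/month; "
              f"plan maintenance in roughly {months_left:.0f} months")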

    The cooperative car

    Kusters believes the involvement of different parties is crucial in these projects. “Mobility has long since ceased to be the exclusive domain of government. Government is taking on a different role in making infrastructure and mobility more sustainable. Technology companies are also becoming increasingly important: companies like TomTom and Google, but also a party such as chip supplier NXP, which can contribute to developing the technology that lets vehicles communicate with one another.”

    The TNO director speaks of the ‘cooperative car’. “That means all the services and modalities you want to use as a driver are linked together. The system then effectively starts thinking along with you.”

    The cooperative car uses IoT technology to communicate directly with other vehicles and with the infrastructure. The car thus continuously takes into account the current traffic situation and the vehicles driving around it. Kusters: “That is a major breakthrough: an efficient, partly self-driving car that always pays attention and is always awake. It allows us to increase road capacity substantially and solve a large part of the congestion problem.”
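
    Vehicle-to-vehicle communication of the kind described here usually means every car periodically broadcasting a small status message (position, speed, heading) that nearby vehicles can react to. A bare-bones sketch using a UDP broadcast; the message format is invented for illustration and is far simpler than real standards such as ETSI's Cooperative Awareness Messages:

        import json
        import socket
        import time

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

        # Broadcast a status beacon a few times per second, as cooperative cars do.
        for _ in range(10):
            beacon = {
                "vehicle_id": "NL-AB-123",        # illustrative identifier
                "lat": 52.3702, "lon": 4.8952,    # stand-in for a live GPS fix
                "speed_kmh": 48.0, "heading_deg": 90.0,
                "ts": time.time(),
            }
            sock.sendto(json.dumps(beacon).encode(), ("255.255.255.255", 37020))
            time.sleep(0.5)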

    Vision of the future

    The Managing Director Urbanisation sees IoT applications for mobility multiplying at a rapid pace. “The autonomous self-driving car in the city may be less far off than we think,” says Kusters. “We already have cars that can park themselves. In the future, that means the parking problems of the big cities will come to an end.”

    Besides the IoT application for cooperative cars, Kusters also sees opportunities to improve the infrastructure itself. “The connectedness of people and devices will also become visible in the streetscape: Wi-Fi in the street, Wi-Fi for cars, smart LED lighting. That does not mean all that information will travel over one and the same network. Time-critical information that affects road safety, for example, will use a separate network. We will see this in practice in cities and on motorways within a few years.”

    Looking further ahead, TNO's director for the living environment also expects the appearance of inner cities to change. “In the city we will increasingly drive electric. We already see that in recent public transport tenders.” Cyclist numbers will also keep growing, according to Kusters. “A city like Amsterdam will then need more room for bicycles,” he says. “Cycling is the only form of mobility that is growing in Amsterdam, so more room for cyclists is important. That will come at the expense of car parking spaces, but it need not simply come at the expense of accessibility.”

    Source: DuurzaamBedrijfsleven
