44 items tagged "Cloud"

  • ‘Privacy as an integral part of security’

    Experts make predictions for 2016

    In 2016, a great deal is set to happen in the field of privacy and data protection. Although the ‘security versus privacy’ debate already started this year, it will truly erupt in 2016. Computable experts discuss various IT security predictions for the coming year, touching among other things on cloud security and the integration of security solutions.

    Privacy
    Richard van Lent, managing partner at MITE Systems:
    2015 was dominated by topics such as ‘Cloud First’, ‘Mobile First’ and the internet of things (IoT). In addition, the Dutch data breach notification act (Wet Meldplicht Datalekken), an addition to the existing Wbp, has stirred up quite a bit of dust in recent months within departments such as HR, legal/risk & compliance, security and IT. For most companies, data analytics solutions will become a precondition for safeguarding privacy, compliance and security in day-to-day operations. An important trend for 2016, currently developing at a rapid pace, is therefore the use of data analytics solutions to provide near real-time insight into matters such as security, predictability and behaviour.

    Gerard Stroeve, manager Security & Continuity Services at Centric:
    Another important security trend for 2016 is privacy. A great deal will happen in the area of privacy in the coming year. Take the data breach notification obligation, for example. It takes effect on 1 January 2016 and requires organisations (both companies and government bodies) to report serious data breaches immediately. The Dutch Data Protection Authority (College Bescherming Persoonsgegevens, CBP), which continues from 1 January as the Autoriteit Persoonsgegevens, will also be given the power to impose fines of up to 820,000 euros. In addition, the European General Data Protection Regulation is an important international development in the field of privacy. It too is expected to come into force in early 2016, with a transition period of about eighteen months. In the coming year, organisations really need to start preparing for that new legislation.

    It is important not to approach privacy as a stand-alone subject. Privacy has its own specific laws and regulations, but it is an integral part of information management and information security as a whole.

    Lex Borger, Principal Consultant at I-to-I:
    The political debate of ‘security versus privacy’ has already begun, but it will truly erupt during the election year in the United States. How much privacy must we give up to fight terrorism? May governments impose requirements in order to gain access to information that is transmitted or stored in encrypted form? Can terrorism be fought that way? Can we trust governments with those capabilities? Can we prevent others (governments, organised crime) from abusing them? Is the citizen still allowed to have something to hide? Plenty of room for discussion; I am curious.

    Cloud, integration and IoT
    Gerard Stroeve, manager Security & Continuity Services, Centric:
    The third major trend for 2016 is cloud security. Many organisations indicate that they still have too little grip on the cloud solutions they use. How, for example, do you reach agreements with large public cloud providers such as Google, Microsoft and Amazon? Another big challenge is so-called ‘shadow IT’, a phenomenon almost every organisation has to deal with these days. When the right functionality is not available, or not available quickly enough, employees increasingly look for a solution in the cloud themselves. Slowly but surely this creates an uncontrolled sprawl of cloud solutions over which the organisation has no grip. That calls for clear policy and good guidelines around safe cloud use, without getting in the way of productivity. The so-called 4C approach offers a practical starting point here.

    John Veldhuis, Senior System Consultant at Sophos:
    Just as we have seen the gateway consolidate formerly stand-alone appliances for email/web proxy, VPN, packet filtering, layer 7 filtering, load balancing and so on into a UTM, we will see a further evolution in which anti-malware, UTM and encryption solutions talk to each other. For example: suspicious traffic, say ransomware-related, is detected. This can be done by software on the infected machine, but also by the UTM. As a result, the following can happen automatically (see the sketch after this list):

    • the machine is stripped of its keys, so that confidential data cannot be leaked, but editing it (encryption by ransomware) is no longer possible either (no key = no access);
    • the machine is placed in a quarantine or mitigation network; on the other machines a search is started for the file that caused the traffic; the origin of the file is traced and added to a reputation filter; and so on.
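
    A minimal sketch of how such an automated response chain could be wired together. The helper functions are stubs standing in for whatever endpoint-management and UTM APIs are actually in use; none of them correspond to a real Sophos interface:

      # Sketch of the automated response described above; the "infrastructure"
      # calls are stubs that a real deployment would replace with its own APIs.
      MANAGED_HOSTS = ["pc-01", "pc-02", "pc-03"]

      def revoke_keys(host):            # stub: pull file-encryption keys from the host
          print(f"[{host}] encryption keys revoked")

      def quarantine(host):             # stub: move the host to a mitigation VLAN
          print(f"[{host}] moved to quarantine network")

      def host_has_file(host, sha256):  # stub: search the host for the file hash
          return False

      def add_to_reputation_filter(sha256):
          print(f"hash {sha256[:12]}... added to reputation filter")

      def handle_detection(alert):
          """React to a suspicious-traffic alert, e.g. ransomware-related."""
          source = alert["host"]
          revoke_keys(source)           # no key = no access: no leak, no re-encryption
          quarantine(source)
          for other in MANAGED_HOSTS:   # hunt the offending file on the other machines
              if other != source and host_has_file(other, alert["file_sha256"]):
                  quarantine(other)
          add_to_reputation_filter(alert["file_sha256"])

      handle_detection({"host": "pc-02", "file_sha256": "9f86d081884c7d659a2feaa0c55ad015"})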

    Harm de Haan, manager consultancy at Telindus:
    Due to the increasing complexity of IT infrastructures and the degree to which their components are integrated with one another, security is becoming ever harder to guarantee. Moreover, the consequences of a breach are greater because of the new data breach notification act. This makes security an important theme for many organisations in 2016. Endpoint solutions cannot sufficiently guarantee security, precisely because of the complexity of modern infrastructures. Organisations would do better to look at ‘security by design’: an approach in which security is built in during the design phase of an infrastructure, as part of the individual components such as compute, networking and storage.

    Lex Borger, Principal Consultant at I-to-I:
    No more excuses: TLS can now be used everywhere, free of charge. Trusted third parties will have to change their business model, because Let's Encrypt delivers free TLS certificates with automatic provisioning. Your site can always run HTTPS: no expensive certificates to manage and no cumbersome application process, although you do have to prove that you control your website. For many this will be secure enough. The paid certificate-management process is complex enough as it is; it is absurd that some still argue for keeping SHA-1 valid longer because we cannot track down and replace our certificates quickly enough.
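
    As a small illustration of keeping track of certificates (so that expiring or outdated ones can actually be found and replaced), here is a sketch in Python using only the standard library; the host name is just a placeholder for the sites in your own inventory:

      import socket
      import ssl
      from datetime import datetime, timezone

      def certificate_expiry(host, port=443):
          """Return the expiry date of the TLS certificate served by host."""
          context = ssl.create_default_context()
          with socket.create_connection((host, port), timeout=5) as sock:
              with context.wrap_socket(sock, server_hostname=host) as tls:
                  cert = tls.getpeercert()
          # 'notAfter' looks like 'Jun  1 12:00:00 2026 GMT'
          expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
          return expires.replace(tzinfo=timezone.utc)

      if __name__ == "__main__":
          host = "example.org"  # placeholder: check every host you operate
          print(host, "certificate expires on", certificate_expiry(host))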

    The internet of things will also grow explosively in 2016, with sensors and small devices that have to work simply and be easy to connect. This gives hackers an enormous number of opportunities to do who knows what (I am not creative enough to imagine everything that is possible). We are going to suffer considerably from the fact that our network infrastructures are inherently insecure. It is still easy to impersonate another device on the internet, and that fact underlies many attacks, from DDoS to identity theft.

    Finally: a solid basic process
    Apart from these three themes, information security above all requires an integrated approach, says Gerard Stroeve. ‘Information security is a broad field with several areas of attention. Besides the themes already mentioned, there is cybersecurity, for example. The number of DDoS or ransomware attacks will not decrease in the coming year. We also expect attention for availability and continuity management to grow next year.’

    To deal effectively with all these diverse threats, he believes a solid basic process is crucial. ‘A good governance model enables you to respond to new and changing threats. Classification and risk analysis are central here. It is also important that information security gets firm management backing.’

    Source: Computable

  • ‘Progress in BI, but keep an eye on ROI’

    Business intelligence (BI) has already been named by Gartner as the top priority for the CIO in 2016. The Computable experts also predict that many major steps will be taken within BI. At the same time, managers must look back as well and think about their business model when deploying big data: how do you justify the investments in big data?

    Kurt de Koning, founder of Dutch Offshore ICT Management
    Gartner has put business intelligence/analytics at number one on the CIO priority list for 2016. In 2016, users will increasingly base their decisions on management information drawn from multiple sources, part of which will consist of unstructured data. BI tools will therefore not only have to present information in a visually attractive way and offer a good user interface. When it comes to unlocking the data, the tools that will stand out are those able to create order and overview out of the many forms in which data appears.

    Laurent Koelink, senior interim BI professional at Insight BI
    Big data solutions alongside traditional BI
    Due to the growth in the number of smart devices, organisations have ever more data to process. Because insight (in the broadest sense) will be one of the most important success factors of the future for many organisations that want to be able to respond flexibly to market demand, they will also have to be able to analyse all these new forms of information. I do not see big data as a replacement for traditional BI solutions, but rather as a complement when it comes to the analytical processing of large volumes of (mostly unstructured) data.

    In-memory solutions
    Organisations increasingly run into the performance limitations of traditional database systems when large volumes of data have to be analysed ad hoc. Specific hybrid database/hardware solutions such as those from IBM, SAP and TeraData have always offered answers here. These are now increasingly joined by in-memory solutions, partly because they are becoming more affordable and therefore more accessible, and partly because such solutions are becoming available in the cloud, which keeps their costs well under control.

    Virtual data integration
    Where data is now often still physically consolidated in separate databases (data warehouses), this will, where possible, be replaced by smart metadata solutions that (whether or not with temporary physical, sometimes in-memory, storage) make time-consuming data extraction and integration processes unnecessary.

    Agile BI development
    Organisations are increasingly forced to move flexibly in and with the chain they operate in. This means that the insights used to steer the business (the BI solutions) must move flexibly along with them. That requires a different way of working from BI development teams, and more and more you see methods such as Scrum also being applied to BI development.

    BI for everyone
    Where BI has traditionally been mostly the domain of organisations, you now see consumers making ever more frequent use of BI solutions as well. Well-known examples are insight into personal finances and energy consumption: the analysis of income and expenses in your bank's web portal or app, and the analysis of data from smart energy meters, are telling examples. This will only increase and become more integrated in the coming years.

    Rein Mertens, head of analytical platform at SAS
    An important trend that I see maturing in 2016 is ‘streaming analytics’. Today, big data is an inseparable part of our daily practice, and the amount of data generated per second keeps growing, in both the personal and the business sphere. Just look at your daily use of the internet, e-mails, tweets, blog posts and other social networks. And on the business side: customer interactions, purchases, customer service calls, promotion via SMS and social networks, and so on.

    An increase in volume, variety and velocity of five exabytes every two days worldwide, and that figure even excludes data from sensors and other IoT devices. There is bound to be interesting information hidden in all this data, but how do you analyse it? One way is to make the data accessible and store it in a cost-effective big data platform. A technology such as Hadoop then inevitably comes into play, after which you use data visualisation and advanced analytics to extract patterns and insights from that mountain of data. In effect, you send the complex logic to the data, without, of course, having to pull all the data out of the Hadoop cluster.

    But what if you want to make smart decisions in real time on the basis of these large volumes of data? Then there is no time to store the data first and analyse it afterwards. Instead, you want to be able to assess, aggregate, track and analyse the data in-stream, for example to detect unusual transaction patterns or analyse sentiment in text, and to act on it immediately. In effect, you send the data past the logic! Logic that sits in memory and has been built to do this very quickly and very cleverly, and to store only the final results. Examples of more than a hundred thousand transactions are no exception here. Per second, that is. Stream it, score it, store it. That is streaming analytics!
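
    A minimal sketch of the ‘stream it, score it, store it’ idea in plain Python: transactions flow past a simple in-memory rule and only the flagged results are stored. The generator and the threshold rule are illustrative stand-ins for a real event source and scoring model:

      import random

      def transaction_stream(n=100_000):
          """Stand-in event source: yields transactions one at a time."""
          for i in range(n):
              yield {"id": i, "amount": round(random.expovariate(1 / 50), 2)}

      def score(txn, threshold=400.0):
          """In-memory scoring logic: flag unusually large amounts."""
          return txn["amount"] > threshold

      stored_alerts = []                   # "store it": only the results are kept
      for txn in transaction_stream():     # "stream it": no staging of the raw data
          if score(txn):                   # "score it": logic applied in-stream
              stored_alerts.append(txn)

      print(f"{len(stored_alerts)} suspicious transactions flagged")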

    Minne Sluis, founder of Sluis Results
    From IoT (internet of things) to IoE (internet of everything)
    Everything is becoming digital and connected, even more so than we could imagine only a short time ago. The application of big data methods and techniques will therefore take an even greater flight.

    The call for adequate data governance will grow
    Although the new world revolves around letting go, giving trust and freedom, and co-creation, the call for manageability will nevertheless increase. Provided it is approached primarily from a facilitating role and ensures more consistency and reliability, that is by no means a bad thing.

    The business impact of big data & data science keeps growing
    The impact of big data & data science in reinventing business processes, services and products, digitising them further (and making them more intelligent), or in some cases eliminating them altogether, will continue.

    The consumerisation of analytics continues
    Greatly improved and genuinely intuitive visualisations, underpinned by good meta-models and thus by data governance, are driving this development. Democratisation and independence from third parties (other than services deliberately taken from the cloud) are thus increasingly becoming reality.

    Big data & data science will fully break through in the non-profit sector
    The subtler objectives of the non-profit sector, such as improving quality, (patient/client/citizen) safety, punctuality and accessibility, call for big data applications. After all, that subtlety requires more good information, and thus data, delivered faster and with more detail and nuance than what typically comes out of the more traditional BI environments today. If the non-profit sector manages to translate the profit sector's much-needed focus on ‘profit’ and ‘revenue improvement’ to its own situation, successful big data initiatives are just around the corner! Mind you, this prediction naturally applies in full to healthcare as well.

    Hans Geurtsen, business intelligence architect data solutions at Info Support
    From big data to polyglot persistence
    In 2016 we will no longer talk about big data, but simply about data. Data of all kinds and in all volumes, calling for different kinds of storage: polyglot persistence. Programmers have known the term polyglot for a long time; an application anno 2015 is often already written in several languages. But on the storage side of an application, relational is no longer the only game in town either. We will apply more and more other kinds of databases in our data solutions, such as graph databases, document databases, and so on. Alongside specialists who know everything about one type of database, you then also need generalists who know exactly which database is suited to what.

    The breakthrough of the modern data warehouse
    ‘A polyglot is someone with a high degree of proficiency in several languages’, according to Wikipedia. That refers to spoken languages, but you come across the term more and more in the IT field as well: an application coded in several programming languages that stores data in several kinds of databases. On the business intelligence side, too, a single language and a single environment no longer suffice. The days of the traditional data warehouse with an ETL pipeline, a central data warehouse and one or two BI tools are numbered. We will see new kinds of data platforms in which all sorts of data from all sorts of sources become accessible to information workers and data scientists using a wide range of tools.

    Business intelligence in the cloud
    While Dutch companies in particular are still hesitant about the cloud, you can see the move towards the cloud slowly but surely getting under way. More and more companies realise that security, in particular, is often better arranged in the cloud than they can arrange it themselves. Cloud vendors are also doing more and more to attract European companies to their cloud. Microsoft's new data centres in Germany, where not Microsoft but Deutsche Telekom controls access to customer data, are an example. 2016 may well be the year in which the cloud really breaks through and in which we will also see more and more complete BI solutions in the cloud in the Netherlands.

    Huub Hillege, principal data(base) management consultant at Info-Shunt
    Big data
    The big data hype will certainly continue in 2016, but success at companies is not guaranteed in advance. Companies and recent graduates keep winding each other up about its application. It is incomprehensible that everyone wants to unlock data from Facebook, Twitter and the like, while the data in these systems is highly unreliable. At every conference I ask where the business case is, including costs and benefits, that justifies all the investments around big data. Even BI managers at companies encourage people to simply get started. So in effect: look back at the data you have or can obtain and investigate whether you find something you might be able to use. To me this is the biggest pitfall, just as it was at the start of data warehouses in 1992. Companies have limited money in the current circumstances. Frugality is called for.

    The analysis of big data must be focused on the future, based on a clear business strategy and a cost/benefit analysis: which data do I need to support that future? Determine:

    • Where do I want to go?
    • Which customer segments do I want to add?
    • Are we going to do more cross-selling (more products) with our current customers?
    • Are we going to take steps to retain our customers (churn)?

    Once these questions have been prioritised and recorded, an analysis must be carried out:

    • Which data/sources do we need for this?
    • Do we have the data ourselves, are there ‘gaps’, or do we need to buy external data?

    Database management systems
    More and more database management system (DBMS) vendors are adding support for big data solutions, such as the Oracle/Sun Big Data Appliance and Teradata/Teradata Aster with support for Hadoop. In the long run, the DBMS solutions will dominate the field; big data software solutions without a DBMS will ultimately lose out.

    Fewer and fewer people, including today's DBAs, still understand how things work technically deep inside a database/DBMS. Increasingly, physical databases are generated from logical data modelling tools, and formal physical database design steps and reports are skipped. Developers who use ETL tools such as Informatica, AbInitio, Infosphere, Pentaho and so on also ultimately generate SQL scripts that move data from sources to operational data stores and/or the data warehouse.

    BI tools such as Microstrategy, Business Objects, Tableau and so on also generate SQL statements.
    Such tools are usually initially developed for a particular DBMS, and people soon assume they can then be applied to every DBMS. As a result, too little use is made of the specific physical characteristics of each DBMS.

    The absence of real expertise then causes performance problems that are discovered at too late a stage. In recent years, by changing database designs and indexes and by restructuring complex or generated SQL scripts, I have been able to bring ETL processes down from six to eight hours to one minute, and queries that ran for 45 to 48 hours down to 35 to 40 minutes.
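
    As a small, self-contained illustration of the kind of effect an index can have on generated SQL, here is a sketch using Python's built-in sqlite3 module; the table and query are made up for the example, and real tuning work on a production DBMS obviously goes far beyond this:

      import random
      import sqlite3
      import time

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
      conn.executemany(
          "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
          [(random.randint(1, 50_000), random.random() * 100) for _ in range(500_000)],
      )

      query = "SELECT COUNT(*), SUM(amount) FROM orders WHERE customer_id = ?"

      def timed(label):
          start = time.perf_counter()
          for cid in range(1, 200):          # simulate many generated lookups
              conn.execute(query, (cid,)).fetchone()
          print(f"{label}: {time.perf_counter() - start:.2f}s")

      timed("without index (full table scans)")
      conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
      timed("with index on customer_id")      # the same lookups become direct seeks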

    Advice
    The amount of data needed will only keep growing. Forget buying all kinds of hyped software packages. Make sure you bring in very strong technical database/DBMS expertise to build the foundation properly from the bottom up, using the strengths of the DBMS you already have. That frees up time and money (you can get by with smaller systems because the foundation is solid) to select the right tools after a sound business case and proofs of concept.

  • 8 out of 10 companies store sensitive data in the cloud

    85% of companies store sensitive data in the cloud. This is a sharp increase compared with the 54% that said they did so last year. 70% of companies are concerned about the security of this data.

    This emerges from research by 451 Research commissioned by Vormetric, a provider of data security for physical, big data, public, private and hybrid cloud environments. Sensitive data is of course not only stored in the cloud: 50% of companies say they hold sensitive data in big data systems (compared with 31% last year), and 33% have such data stored in Internet of Things (IoT) environments.

    Concerns about the cloud
    451 Research also asked respondents about their concerns regarding the security of their sensitive data in the cloud. The main concerns are:

    • Cyber attacks and breaches at a service provider (70%)
    • The vulnerability of a shared infrastructure (66%)
    • A lack of control over the location where data is stored (66%)
    • The lack of a data privacy policy or privacy SLA (65%)

    Respondents were also asked which changes would increase their willingness to put data in the cloud. The changes they most want to see are:

    • Encryption of data, with the encryption key managed on the company's own infrastructure (48%)
    • Detailed information about physical and IT security (36%)
    • The option to choose encryption for data stored on a service provider's infrastructure (35%)

    Concerns about big data systems
    The storage of sensitive data in big data systems also worries respondents. The main concerns are:

    • The security of reports created with big data systems, since these can contain sensitive data (42%)
    • The fact that data can be located anywhere within these environments (41%)
    • Privacy violations involving data originating from different countries (40%)
    • Access by users with ‘super user’ rights to protected data (37%)
    • The lack of a security framework and management capabilities within the environment (33%)

    451 Research also notes that big data systems often run in the cloud. Concerns about storing sensitive data in the cloud therefore also apply to data stored in big data environments.

    Data in IoT environments also raises concerns
    Finally, 451 Research looks at the concerns companies have about storing data in IoT environments. The main concerns in this area are:

    • Protecting the data created by IoT (35%)
    • Privacy violations (30%)
    • Identifying which data is sensitive (29%)
    • Access by users with ‘super user’ rights to IoT data and devices (28%)
    • Attacks on IoT devices that could impact critical business operations (27%)


    Source: Executive People

  • A cloud outsourcing analysis: should your company consider it?

    A cloud outsourcing analysis: should your company consider it?

    Learn about outsourcing cloud computing, its pros and cons, and what to ask a provider.

    There are many benefits to cloud computing, including cost savings, flexibility, and scalability, and it’s a popular choice for many businesses. In fact, over 50% of corporate data was stored with a cloud provider in 2021.

    The uses of cloud computing for businesses are numerous, from an initial transfer of data and services to automating company activities via the cloud. Many businesses prefer to outsource some of their cloud management capabilities in order to consolidate their IT resources.

    What is cloud outsourcing?

    A cloud outsourcing provider delivers managed cloud services for businesses. When a business chooses cloud outsourcing, they get all the benefits that come with cloud computing along with a dedicated team that manages their cloud services.

    These providers can design elastic architectures in the cloud and migrate your current infrastructure and legacy applications to the cloud for you. Once they have your business processes and systems in the cloud, your provider’s team will make sure they are all running at peak efficiency and handle maintenance tasks.

    The managed solution provider will assess the way you currently use technology in your business and create a custom solution using one, two, or all three types of cloud computing service models.

    Types of cloud computing service models

    Each cloud computing model provides different types of services. The model a cloud outsourcing provider chooses to implement depends on the legacy systems they are replacing and their customers’ technology needs.

    Here are the three main cloud computing service models:

    • Software-as-a-Service (SaaS): With SaaS, a business buys fully functional software from a cloud provider on a subscription basis. The business’ employees access the software via the internet, and the cloud provider manages all upkeep and maintenance.
    • Infrastructure-as-a-Service (IaaS): With IaaS, businesses purchase the infrastructure they need from a cloud provider via a subscription, and the infrastructure is delivered over the internet. These resources could include data storage, networking, servers, file storage, and more. The cloud services provider keeps all underlying hardware and software up to date.
    • Platform-as-a-Service (PaaS): With PaaS, a business rents a complete application development environment and tools from a cloud provider, which the business accesses via the internet.

    What’s the difference between traditional outsourcing and cloud outsourcing?

    Some businesses outsource marketing, accounting, and other functions (such as IT) to third parties who complete all associated tasks as if they were an extension of the client.

    With traditional IT outsourcing, businesses work with a third-party IT solutions provider that manages their data, servers, network, security, and more. The third-party is responsible for the functionality and upkeep of all services they supply. These providers are often known as managed service providers (MSP).

    Cloud outsourcing is similar, but there are some key differences. A cloud outsourcing services provider still manages a business’ IT solution, but they do so using cloud computing technology. Traditional MSPs may have their own data centers where they store client data.

    With cloud outsourcing, a business gets the best of two worlds. The upkeep, management, and configuration of the technology they use are handled by a third party, and their applications, data, networking, databases, and more are hosted with a cloud provider. According to one survey of businesses using cloud outsourcing, 96% of companies were satisfied with the results.

    Pros of cloud computing

    Cloud computing has many benefits; here are some your business can take advantage of by choosing a cloud-managed service provider (cloud outsourcing):

    Unlimited storage capacity

    With a traditional IT infrastructure, more storage requires more hardware, installation, and maintenance. In the cloud, this infrastructure is readily available because the cloud provider has planned ahead for growth. Adding more storage happens with the click of a button, or can even be automated based on needs.

    Automated backup/restoration of files and data

    Another thing you can automate in the cloud is backups. Each time data is stored or files are uploaded, a redundant copy can be created and stored securely. If anything ever happens to live production data or files, this copy can replace the original in a matter of seconds.
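
    As one concrete illustration of automating a redundant copy, here is a short Python sketch using the widely used boto3 library to upload a timestamped backup of a file to an S3 bucket; the bucket and file names are placeholders, and a managed services provider would typically handle this for you:

      from datetime import datetime, timezone

      import boto3  # AWS SDK for Python; assumes credentials are already configured

      def backup_file(local_path: str, bucket: str) -> str:
          """Upload a timestamped copy of local_path to the given S3 bucket."""
          stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
          key = f"backups/{stamp}/{local_path.split('/')[-1]}"
          boto3.client("s3").upload_file(local_path, bucket, key)  # redundant copy in the cloud
          return key

      if __name__ == "__main__":
          print("stored as", backup_file("orders.db", "my-company-backups"))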

    Reduce infrastructure costs

    This is one of the main reasons businesses choose cloud outsourcing, as hardware is the responsibility of the cloud services provider. The end user doesn’t have to buy, maintain, or fix hardware-related issues, which can lower ongoing costs.

    Cons of cloud computing

    Most of the cons of cloud computing can be solved through the correct configuration of cloud resources, so with the right cloud services provider, they won’t be an issue. But it’s always better to be prepared, so here are a few to be aware of:

    DoS attacks, data loss, and theft

    When you store your data on a server you don’t control, you are trusting its security to a third party. You could also be just as susceptible to DoS attacks as you would be if you hosted your technology in-house.

    Fortunately, most data loss and theft on the cloud are due to misconfiguration, which often happens with users new to the cloud but rarely with experts (such as the staff of a managed services provider).

    Technology vulnerabilities, especially in shared environments

    There are some hazards involved with sharing a cloud provider. Unauthorized access via poor access restrictions and the abuse of employee credentials are among the most common cloud security problems.

    These issues usually occur when cloud resources are misconfigured or user passwords don’t comply with security standards, and can be mitigated by following best practices.

    Dependent on an internet connection

    Cloud computing does require an internet connection to work, meaning your service can be affected by internet outages. On the provider side, this is a rare concern as cloud service providers use redundant resources, and resources that fail are replaced on the fly.

    If the internet connection for a business using cloud services goes down, employees can often still connect to the services via a cellular data connection.

    How to choose a cloud service provider

    It can take time and work to migrate a business’s technology to the cloud, so it is important to vet providers and choose one that fits your business. Here are some topics to focus on, and questions to ask as you explore cloud outsourcing:

    • Service-level agreement (SLA): Will the service provider supply an SLA that guarantees the uptime of your system? What other details are included in the SLA? Make sure to examine it thoroughly.
    • Scheduled maintenance: How will your service provider handle scheduled maintenance and mission-critical events? What do they do if a data breach occurs?
    • Recovery and backup: Does the provider supply disaster recovery as a service? What is their continuity plan?
    • Security: The questions you ask your provider about cybersecurity are among the most important. How do they handle it? How big is their team? What type of physical security do they use at their data centers?
    • Compliance: If you have to store data that falls under compliance standards such as PCI or HIPAA, make sure your provider can store this data according to regulations.
    • Scalability: How does the provider handle changes in traffic and usage? Can you adjust it yourself, or will it scale automatically? If you experience usage surges, you will want to choose an auto-scaling option.
    • On-site services: After cloud migration, will some of your data, hardware, or software remain on-premise? If so, will the provider train your staff and show them how it integrates with the cloud infrastructure?
    • Customer service: The provider should be available when you need them. Do they have teams working 24/7 to handle issues and customer service requests?

    Author: Stephan Miller

    Source: Capterra

  • Amazon beats profit expectations thanks to the cloud

    Amazon beats profit expectations thanks to the cloud

    Thanks to strong growth in its cloud computing division, Amazon has reported better quarterly results than expected. The company's revenue rose 17% to 59.7 billion dollars (53.5 billion euros). Net profit came in at 3.56 billion dollars, more than twice the 1.6 billion dollars of a year earlier. Operating income also more than doubled, to 4.4 billion dollars.

    That comfortably beats Wall Street's expectations, according to Silicon Angle. Analysts had counted on a net profit of 2.3 billion dollars on revenue of 59.65 billion dollars.

    Cloud

    Most of the profit once again came from the company's cloud division. Cloud revenue rose 41% year on year to 7.7 billion dollars, just a little more than analysts had expected. Operating income rose 59% to 2.22 billion dollars.

    The division's revenue growth slowed somewhat compared with the fourth quarter, when growth was still 45%. According to Chief Financial Officer (CFO) Brian Olsavsky, this is partly due to a very strong quarter for Amazon Web Services (AWS) a year earlier, which makes the comparisons harder.

    In addition, according to Olsavsky, business in this area always remains somewhat harder to predict, because of uncertainty about how quickly companies adopt AWS and the cloud and migrate operations out of their own data centres. Customer usage of AWS services nevertheless continues to grow faster than revenue.

    Advertising

    Not only the cloud division but also the company's advertising division saw strong growth. Amazon's ‘other’ segment, which consists mainly of advertising sales, saw revenue rise 34% to 2.72 billion dollars. Amazon is now the third-largest seller of digital advertising, behind Google and Facebook.

    For the current quarter, Amazon has adjusted its outlook. The company expects revenue of between 59.5 billion and 63.5 billion dollars, in line with Wall Street's expectation of 62.4 billion dollars. The expected operating income of between 2.6 billion and 3.6 billion dollars, however, is lower than Wall Street's expectation of 4.19 billion dollars.

    Source: Techzine

  • Approaching the Transformation phase in ELT processes

    Approaching the Transformation phase in ELT processes

    Much has been written about the shift from ETL to ELT and how ELT enables superior speed and agility for modern analytics. One important move to support this speed and agility is creating a workflow that enables data transformation to be exploratory and iterative. Defining an analysis requires an iterative loop of forming and testing these hypotheses via data transformation. Reducing the latency of that interactive loop is crucial to reducing the overall time it takes to build data pipelines.

    ELT achieves flexibility by enabling access to raw data instead of predefined subsets or aggregates. The process achieves speed by leveraging the processing power of Cloud Data Warehouses and Data Lakes. A simple, dominant pattern is emerging: move all of your data to cost effective cloud storage and leverage cloud compute for transformation of data prior to analysis.

    What this takes:

    1. Extract your data from source systems
    2. Load your data into the cloud platform
    3. Transform your data in the cloud! (A minimal sketch of these steps follows below.)
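
    A minimal, self-contained sketch of those three steps in Python, using an in-memory SQLite database as a stand-in for a cloud data warehouse (a real pipeline would load into Snowflake, BigQuery, Redshift or similar and transform there):

      import csv
      import io
      import sqlite3

      # 1. Extract: pull raw records from a source system (an inline CSV here).
      raw = io.StringIO("order_id,customer,amount\n1,acme,120.50\n2,globex,75.00\n3,acme,300.00\n")
      rows = list(csv.DictReader(raw))

      # 2. Load: land the raw, untransformed data in the "warehouse".
      warehouse = sqlite3.connect(":memory:")  # stand-in for a cloud data warehouse
      warehouse.execute("CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL)")
      warehouse.executemany("INSERT INTO raw_orders VALUES (:order_id, :customer, :amount)", rows)

      # 3. Transform: use the warehouse's own compute (SQL) to derive analysis-ready tables.
      warehouse.execute("""
          CREATE TABLE revenue_per_customer AS
          SELECT customer, SUM(amount) AS revenue
          FROM raw_orders
          GROUP BY customer
      """)
      print(warehouse.execute("SELECT * FROM revenue_per_customer ORDER BY revenue DESC").fetchall())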

    There are a few different approaches to doing the transformation work as part of the ELT process.

    Code only solutions

    Individuals and teams proficient in languages such as SQL or Python can write transformations that run directly against the cloud data warehouse or data lake. Tools such as DBT and Dataform provide infrastructure around code to help teams build more maintainable pipelines. Code gives its authors ultimate flexibility to build anything the underlying system supports. Additionally, there are large communities of Python and SQL developers as well as a wealth of examples, best practices, and forums to learn from. Of course, there are many individuals in organizations that do not know how to write code or simply prefer not to but still need to efficiently transform data.

    Visual only solutions

    Individuals and teams that prefer visual transformation can leverage visual tools to build their pipelines. To gain the benefits of ELT, visual tools increasingly execute these pipelines directly in Cloud Data Warehouses and Lakes instead of in proprietary runtimes. Visual solutions appeal to a large community of data pipeline developers who need to produce data assets but don’t necessarily want to or know how to code. These solutions often provide more automated approaches to building pipelines, increasing efficiency for many use cases. However, visual-only approaches can at times be less flexible than coding against the underlying system: certain use cases are not performant enough or simply not possible in the tool.

    Visual + code solutions

    We believe increasingly that modern data transformation tools will support both approaches to enable a broader community to collaborate and scale how this work is done in organizations. Use a visual approach where it makes sense but also enable users to leverage code where it makes sense. This best of both worlds approach has two major benefits:

    1. Increased efficiency for individuals: While some have strong preferences for doing their work in code or in a visual tool, we find that many people just want to get the job done as efficiently and effectively as possible. Each approach has advantages and disadvantages – providing flexibility allows an individual to choose the right tool for the job.
    2. Collaboration across the organization: Organizations have some users who prefer to code and some who prefer not to. Solutions providing support for both have the potential to enable collaboration across users with varied skill sets and use cases.

    Approach at Trifacta
    Moving forward, Trifacta is increasingly investing in two areas to enable collaboration across teams and individuals working in both code and user interfaces:

    1. Running code within Trifacta: Most of Trifacta's customers primarily leverage its visual interface to build transformations, but many of them also use the existing functionality for running SQL queries directly within Trifacta, building pipelines with SQL and/or Trifacta's internal DSL. Soon, Trifacta plans to support other languages such as Python.
    2. Generating code with Trifacta: The pipelines customers build in Trifacta are built on top of its internal transformation language. This language is then translated into code that can run across a variety of different platforms. Today, Spark, Google Dataflow and Photon (Trifacta's engine for in-browser computation) are supported, and common transformations are pushed down into databases and cloud data warehouses like Snowflake, BigQuery and Redshift. To date, this code is run on behalf of customers, but Trifacta has received many requests to take the generated code and use it entirely outside of Trifacta.

    Author: Sean Kandel

    Source: Trifacta

  • Best practices for cloud consulting: from overall strategy to attention for detail

    Best practices for cloud consulting: from overall strategy to attention for detail

    Did you know that the average person uses 36 cloud-based services every single day? Many businesses have naturally moved to or are considering cloud computing for benefits like efficiency, knowledge, enhanced security, and reduced costs. So there’s no denying that cloud computing has completely changed our lives and the way we do business.

    Still, many organizations get overwhelmed by the challenges lying ahead. Those challenges mainly include incompatibility issues, security risks, lack of expertise, and governance & compliance issues. And to tackle those challenges, the demand for cloud consulting services has grown dramatically; the global cloud infrastructure services spending increased to $55 billion in Q1 2022.

    The cloud consulting market is projected to reach $22 billion by 2022. However, the skills needed to work effectively with your clients are just as important as technical capabilities. Here are seven must-know best practices for cloud consultants:

    1. Start with a Strategy

    This is where you and your team begin to outline a plan. You need to understand what the goals of your business are, who your target audience is, and what you intend to sell them. Once these things are defined, you can move on to developing your brand. Branding is an important part of any business’s foundation because it helps define who you are and how others see you. It also helps guide decision-making by giving employees something they can refer back to when making decisions that may affect how customers view the company as a whole (i.e. if something goes wrong).

    • Define Your Target Audience: Who is going to be buying from me? What do they look like? How old are they? What gender do I want my brand to appeal to? These questions will help shape where we go from here with our marketing strategy for targeting this particular demographic group.
    • Define Your Product Or Service: This one seems pretty obvious, but there’s more to defining your product than just saying “I’m selling fruit.” You need specifics: which fruits or vegetables, organic versus conventional, the price per pound, and so on. The better you define this now, the easier it will be to answer questions later about why certain tactics are no longer delivering the sales you expect, instead of scrambling for answers.

    2. Be Patient

    Cloud consulting is a long-term game. It takes time to build a successful cloud consulting business, team and practice. It also takes time to create a client base that attracts the right clients and projects for your business. Because of this, any cloud consultant who tells you they’ve hit their stride in six months or less is most likely lying through their teeth (or at least not telling you the whole truth).

    3. Be Prepared for Growing Pains

    Cloud consulting is an exciting and growing industry with a lot of potential. As the cloud continues to evolve and grow, it will create new problems that you need to be prepared for. The cloud is still a relatively new technology, so it’s best not to expect everything will go smoothly from day one.

    4. Testing, testing and more testing

    Testing is the most important aspect of the whole process. You can have a great team, but without proper testing you will not be able to deliver quality products. Testing your applications on different platforms is just as important as testing them in different environments and browsers. Mobile devices are becoming more and more popular, so you may want to test your applications on various mobile devices as well. There are also many operating systems that need to be taken into consideration when you do your development work. For instance, if you are developing an application for both Windows and Mac OS X users, make sure that your product works smoothly under both operating systems and does not crash at any point during its usage.

    5. Focus on automation

    The most important thing a cloud consultant can do to improve their business is to automate processes. Automation will help you scale your business, reduce errors, cut costs and improve customer experience. Automation helps you scale by enabling your employees to focus on higher-value tasks that require more specialized knowledge or skill. It also allows you to provide better service at a lower cost, because it reduces the need for manual work that takes up time and resources. Improving customer service is one of the keys to being successful as a consulting firm; however, this can be difficult when customer satisfaction depends on the availability of human resources with expertise in certain areas such as compliance or security. Automation tools like Zapier can take over repetitive tasks, such as sending out invoices automatically after each project ends, so there’s no need for manual labor; this frees up staff members to provide value through other means (such as analyzing data or developing new products/services).

    6. Cloud consulting is in high demand

    Cloud consulting is the perfect career if you’re looking to learn new skills, earn money and build respect. Cloud consulting is in high demand right now. In fact, according to Glassdoor’s Best Jobs 2019 report, Cloud Solutions Architect was one of the top jobs with a median base salary of $100K per year. If you want to become a cloud consultant, there are two main types: technical and business-focused. Technical consultants will work on projects related specifically to cloud technologies like AWS or Azure, while business consultants may help clients understand how their organization can benefit from moving some workloads off-premises into the public or private cloud.

    7. Have eye for detail

    People who do well in cloud consulting have an eye for detail, love to solve problems and enjoy building new relationships.

    • You need to be detail-oriented. If you can’t handle the small stuff, it’s not going to work.
    • You need to be able to solve problems and troubleshoot problems quickly. You will get asked a lot of questions, so being able to come up with an answer right away is key.
    • You need good communication skills that allow for building relationships with new people and vendors/partners in your industry as well as existing customers who may have been with your company for years, if not decades!
    • Learning new things shouldn’t scare you; it should excite you! The more you know about the different technologies and tools available today, the better equipped you will be when someone comes along with an idea they want implemented but aren’t sure how to yet.

    Conclusion

    Cloud consulting is a great niche to be in right now as more companies are looking for ways to reduce costs and increase efficiency. This can be done through the use of technologies like virtualization or containerization, which allow you to run multiple applications on one server instead of just one application per server. The best part about cloud consulting services is that they provide flexibility so that you can tailor your services based on what works best for your client’s needs without losing any functionality.

  • Better analytics must address cloud computing's remaining challenges

     

    Without proper analytics in place, many cloud services customers are wasting resources, struggling with compliance and suffering from outages and unexpected costs, according to a new study from Forrester Research.

    The study, sponsored by enterprise cloud hosting provider iLand, shows that all of the 275 IT decision makers and senior business executives surveyed in the United States, United Kingdom and Singapore said they’ve experienced at least one negative financial or operational impact due to missing or hidden metadata. These negative business impacts include outages, wasted resources, unexpected costs and challenges reporting to management.

    “Companies aren’t just using the cloud—they depend on it,” the report says. “Nevertheless, cloud providers fail to keep cloud users happy. As companies expand their use of cloud services, they need to be confident that their cloud providers aren’t holding anything back, and are committed to their success.”

    Other findings of the report are that overall, cloud customer satisfaction is low, with a strong sentiment that providers don’t respond to customers’ needs. More than half of the respondents said their provider does not understand their company’s needs or care about their success.

    Forrester recommends that companies considering cloud services evaluate the native tools delivered by the cloud platform to ensure they provide visibility, alerting and analytics; demand clarity about compliance data, on-call experts and straightforward processes from their cloud provider; and look for a cloud with onboarding and support teams staffed by experts.

    Author: Bob Violino

    Source: Information Management

  • BI topics to tackle when migrating to the cloud

    BI topics to tackle when migrating to the cloud

    When your organization decides to pull the trigger on a cloud migration, a lot of stuff will start happening all at once. Regardless of how long the planning process has been, once data starts being relocated, a variety of competing factors that were all theoretical earlier become devastatingly real: frontline business users still want to be able to run analyses while the migration is happening, your data engineers are concerned with the switch from whatever database you were using before, and the development org has its own data needs. With a comprehensive, BI-focused data strategy, you and your stakeholders will know what your ideal data model should look like once all your data is moved over. This way, as you’re managing the process and trying to keep everyone happy, you end up in a stronger place when your migration is over than you were at the start, and isn’t that the goal?

    BI focus and your data infrastructure

    “What does all this have to do with my data model?” you might be wondering. “And for that matter, my BI solution?”

    I’m glad you asked, internet stranger. The answer is everything. Your data infrastructure underpins your data model and powers all of your business-critical IT systems. The form it takes can have immense ramifications for your organization, your product, and the new things you want to do with it. Your data infrastructure is hooked into your BI solution via connectors, so it’ll work no matter where the data is stored. Picking the right data model, once all your data is in its new home, is the final piece that will allow you to get the most out of it with your BI solution. If you don’t have a BI solution, the perfect time to implement one is once all your data is moved over and your model is built. This should all be part of your organization’s holistic cloud strategy, with buy-in from major partners who are handling the migration.

    Picking the right database model for you

    So you’re giving your data a new home and maybe implementing a BI solution when it’s all done. Now, what database model is right for your company and your use case? There are a wide array of ways to organize data, depending on what you want to do with it.

    One of the broadest is a conceptual model, which focuses on representing the objects that matter most to the business and the relationships between them. This database model is designed principally for business users. Compare this to a physical model, which is all about the structure of the data. In this model, you’ll be dealing with tables, columns, relationships, graphs, and foreign keys, which define the connections between the tables.

    Now, let’s say you’re only focused on representing your data organization and architecture graphically, putting aside the physical usage or database management framework. In cases like these, a logical model could be the way to go. Examples of these types of databases include relational (dealing with data as tables or relations), network (putting data in the form of records), and hierarchical (which is a progressive tree-type structure, with each branch of the tree showing related records). These models all feature a high degree of standardization and cover all entities in the dataset and the relationships between them.

    Got a wide array of different objects and types of data to deal with? Consider an object-oriented database model, sometimes called a “hybrid model.” These models look at their contained data as a collection of reusable software pieces, all with related features. They also consolidate tables but aren’t limited to the tables, giving you freedom when dealing with lots of varied data. You can use this kind of model for multimedia items you can’t put in a relational database or to create a hypertext database to connect to another object and sort out divergent information.

    Lastly, we can’t help but mention the star schema here, which has elements arranged around a central core and looks like an asterisk. This model is great for querying informational indexes as part of a larger data pool. It’s used to dig up insights for business users, OLAP cubes, analytics apps, and ad-hoc analyses. It’s a simple, yet powerful, structure that sees a lot of usage, despite its simplicity.
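
    To make that concrete, here is a minimal sketch of a star schema built with Python’s sqlite3, purely for illustration: a central fact table surrounded by the dimension tables it references. Table and column names are invented for the example:

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          -- Dimension tables: the points of the star.
          CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, day TEXT, month TEXT, year INTEGER);
          CREATE TABLE dim_product  (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
          CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT);

          -- Fact table: the core, holding measures plus foreign keys to each dimension.
          CREATE TABLE fact_sales (
              sale_id     INTEGER PRIMARY KEY,
              date_id     INTEGER REFERENCES dim_date(date_id),
              product_id  INTEGER REFERENCES dim_product(product_id),
              customer_id INTEGER REFERENCES dim_customer(customer_id),
              quantity    INTEGER,
              revenue     REAL
          );
      """)

      # A typical ad-hoc query: slice the central fact table by dimension attributes.
      print(conn.execute("""
          SELECT d.year, p.category, SUM(f.revenue)
          FROM fact_sales f
          JOIN dim_date d    ON f.date_id = d.date_id
          JOIN dim_product p ON f.product_id = p.product_id
          GROUP BY d.year, p.category
      """).fetchall())  # empty until rows are loaded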

    Now what?

    Whether you’re building awesome analytics into your app or empowering in-house users to get more out of your data, knowing what you’re doing with your data is key to maintaining the right models. Once you’ve picked your database, it’s time to pick your data model, with an eye towards what you want to do with it once it’s hooked into your BI solution.

    Worried about losing customers? A predictive churn model can help you get ahead of the curve by putting time and attention into relationships that are at risk of going sour. On the other side of the coin, predictive up- and cross-sell models can show you where you can get more money out of a customer and which ones are ripe to deepen your financial relationship.

    What about your marketing efforts? A customer segmentation data model can help you understand the buying behaviors of your current customers and target groups, and which marketing plays are having the desired effect. Or go beyond marketing with “next-best-action” models that take into account life events, purchasing behaviors, social media, and anything else you can get your hands on, so that you can figure out which next action with a given target (email, ads, phone call, etc.) will have the greatest impact. And predictive analyses aren’t just for human-centric activities; manufacturing and logistics companies can take advantage of maintenance models that let you circumvent machine breakdowns based on historical data. Don’t get caught without a vital piece of equipment again.
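
    A minimal sketch of a predictive churn model of the kind described above, using scikit-learn with made-up feature values; a real model would of course be trained on your own customer history:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Toy training data: [monthly_logins, support_tickets, months_as_customer]
      X = np.array([
          [20, 0, 36], [2, 5, 3], [15, 1, 24], [1, 4, 2],
          [30, 0, 48], [3, 6, 5], [18, 2, 30], [0, 3, 1],
      ])
      y = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # 1 = churned, 0 = retained

      model = LogisticRegression().fit(X, y)

      # Score current customers: high probabilities flag at-risk relationships.
      current = np.array([[4, 3, 6], [25, 0, 40]])
      for features, p in zip(current, model.predict_proba(current)[:, 1]):
          print(features, f"churn risk: {p:.0%}")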

    Bringing it all together with BI

    Staying focused on your long-term goals is an important key to success. Whether you’re building a game-changing product or rebuilding your data model, having a well-defined goal makes all the difference in the world when it comes to the success of your enterprise. If you’re already migrating your data to the cloud, then you’re at the perfect juncture to pick the right database and data models for your eventual use cases. Once these are set up, they’ll integrate seamlessly with your BI tool (and if you don’t have one yet, it’ll be the perfect time to implement one). Big moves like this represent big challenges, but also big opportunities to lay the foundation for whatever you’re planning on building. Then you just have to build it!

    Author: Jack Cieslak

    Source: Sisense

  • Big Data on the cloud makes economic sense

    With Big Data analytics solutions increasingly being made available to enterprises in the cloud, more and more companies will be able to afford and use them for agility, efficiency and competitiveness

    Photo caption: For almost 10 years, only the biggest of technology firms such as Alphabet Inc.’s Google and Amazon.com Inc. used data analytics on a scale that justified the idea of ‘big’ in Big Data. Now more and more firms are warming up to the concept. Photo: Bloomberg

    On 27 September, enterprise software company SAP SE completed the acquisition of Altiscale Inc.—a provider of Big Data as-a-Service (BDaaS). The news came close on the heels of data management and analytics company Cloudera Inc. and data and communication services provider CenturyLink Inc. jointly announcing BDaaS services. Another BDaaS vendor, Qubole Inc., said it would offer a big data service solution for the Oracle Cloud Platform.

    These are cases in point of the growing trend to offer big data analytics using a cloud model. Cloud computing allows enterprises to pay for software modules or services used over a network, typically the Internet, on a monthly or otherwise periodic basis. It helps firms avoid relatively large upfront costs for licences and infrastructure. Big Data analytics solutions enable companies to analyse multiple data sources, especially large data sets, to take more informed decisions.
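    To make that cost trade-off concrete, here is a back-of-the-envelope comparison; all figures are hypothetical and only illustrate the shape of the calculation, not actual vendor pricing.

```python
# Back-of-the-envelope cost comparison with hypothetical numbers only.
# Upfront licence plus hardware versus a pay-per-use subscription billed
# monthly, both totalled over three years.
upfront_license = 300_000      # one-time licence fee (hypothetical)
upfront_hardware = 200_000     # servers and storage (hypothetical)
monthly_subscription = 12_000  # cloud/BDaaS fee per month (hypothetical)

months = 36
on_premises_total = upfront_license + upfront_hardware
cloud_total = monthly_subscription * months

print(f"on-premises (3 yr): {on_premises_total:,}")
print(f"cloud pay-per-use (3 yr): {cloud_total:,}")
# The cloud option spreads cost over time and avoids the upfront outlay;
# the actual break-even point depends on real prices and usage.
```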

    According to research firm International Data Corporation (IDC), the global big data technology and services market is expected to grow at a compound annual growth rate (CAGR) of 23.1% over 2014-2019, and annual spending is estimated to reach $48.6 billion in 2019.


    MarketsandMarkets, a research firm, estimates the BDaaS segment will grow from $1.8 billion in 2015 to $7 billion in 2020. There are other, even more optimistic estimates: research firm Technavio, for instance, forecasts this segment to grow at a CAGR of 60% from 2016 to 2020.

    Where does this optimism stem from?

    For almost 10 years, it was only the biggest of technology firms, such as Alphabet Inc.’s Google and Amazon.com Inc., that used data analytics on a scale that justified the idea of ‘big’ in Big Data. In industry parlance, three key attributes are often used to understand the concept of Big Data. These are volume, velocity and variety of data—collectively called the 3Vs.

    Increasingly, not just Google and its rivals, but a much wider swathe of enterprises are storing, accessing and analysing a mountain of structured and unstructured data. The trend is necessitated by growing connectivity, falling cost of storage, proliferation of smartphones and huge popularity of social media platforms—enabling data-intensive interactions not only among ‘social friends’ but also among employers and employees, manufacturers and suppliers, retailers and consumers—virtually all sorts of connected communities of people.

    A November 2015 IDC report predicts that by 2020, organisations that are able to analyse all relevant data and deliver actionable information will achieve an extra $430 billion in productivity benefits over their less analytically oriented peers.

    The nascent nature of BDaaS, however, is causing some confusion in the market. In a 6 September article on Nextplatform.com, Prat Moghe, founder and chief executive of Cazena—a services vendor—wrote that there is confusion regarding the availability of “canned analytics or reports”. According to him, vendors (solutions providers) should be carefully evaluated and aspects such as moving data sets between different cloud and on-premises systems, ease of configuration of the platform, etc., need to be kept in mind before making a purchase decision.

    “Some BDaaS providers make it easy to move datasets between different engines; others require building your own integrations. Some BDaaS vendors have their own analytics interfaces; others support industry-standard visualization tools (Tableau, Spotfire, etc.) or programming languages like R and Python. BDaaS vendors have different approaches, which should be carefully evaluated,” he wrote.

    Nevertheless, the teething troubles are likely to be far outweighed by the benefits that BDaaS brings to the table. The key drivers, according to the IDC report cited above, include digital transformation initiatives being undertaken by a lot of enterprises; the merging of real life with digital identity as all forms of personal data become available in the cloud; availability of multiple payment and usage options for BDaaS; and the ability of BDaaS to put more analytics power in the hands of business users.

    Another factor that will ensure growth of BDaaS is the scarcity of skills in cloud as well as analytics technologies. Compared to individual enterprises, cloud service providers such as Google, Microsoft Corp., Amazon Web Services and International Business Machines Corp. (IBM) can attract and retain talent more easily and for longer durations.

    Manish Mittal, managing principal and head of global delivery at Axtria, a medium-sized Big Data analytics solutions provider, says the adoption of BDaaS in India is often driven by business users. While the need is felt by both chief information officers and business leaders, he believes that the latter often drive adoption as they feel more empowered in the organisation.

    The potential for BDaaS in India can be gauged from Axtria’s year-on-year business growth of 60% for the past few years—and there are several niche big data analytics vendors currently operating in the country (besides large software companies).

    Mittal says that the growth of BDaaS adoption will depend on how quickly companies tackle the issue of improving data quality.

    Source: livemint.com, October 10, 2016
     

     

  • Big Data Predictions for 2016

    A roundup of big data and analytics predictions and pontifications from several industry prognosticators.

    At the end of each year, PR folks from different companies in the analytics industry send me predictions from their executives on what the next year holds. This year, I received a total of 60 predictions from a record 17 companies. I can't laundry-list them all, but I can and did put them in a spreadsheet (irony acknowledged) to determine the broad categories many of them fall in. And the bigger of those categories provide a nice structure to discuss many of the predictions in the batch.

    Predictions streaming in
    MapR CEO John Schroeder, whose company just added its own MapR Streams component to its Hadoop distribution, says "Converged Approaches [will] Become Mainstream" in 2016. By "converged," Schroeder is alluding to the simultaneous use of operational and analytical technologies. He explains that "this convergence speeds the 'data to action' cycle for organizations and removes the time lag between analytics and business impact."

    The so-called "Lambda Architecture" focuses on this same combination of transactional and analytical processing, though MapR would likely point out that a "converged" architecture co-locates the technologies and avoids Lambda's approach of tying the separate technologies together.

    Whether integrated or converged, Phu Hoang, the CEO of DataTorrent predicts 2016 will bring an ROI focus to streaming technologies, which he summarizes as "greater enterprise adoption of streaming analytics with quantified results." Hoang explains that "while lots of companies have already accepted that real-time streaming is valuable, we'll see users looking to take it one step further to quantify their streaming use cases."

    Which industries will take charge here? Hoang says "FinTech, AdTech and Telco lead the way in streaming analytics." That makes sense, but I think heavy industry is, and will be, in a leadership position here as well.

    In fact, some in the industry believe that just about everyone will formulate a streaming data strategy next year. One of those is Anand Venugopal of Impetus Technologies, whom I spoke with earlier this month. Venugopal, in fact, feels that we are within two years of streaming data being looked upon as just another data source.

    Internet of predicted things
    It probably won't shock you that the Internet of Things (IoT) was a big theme in this year's round of predictions. Quentin Gallivan, Pentaho's CEO, frames the thoughts nicely with this observation: "Internet of Things is getting real!" Adam Wray, CEO at Basho, quips that "organizations will be seeking database solutions that are optimized for the different types of IoT data." That might sound a bit self-serving, but Wray justifies this by reasoning that this will be driven by the need to "make managing the mix of data types less operationally complex." That sounds fair to me.

    Snehal Antani, CTO at Splunk, predicts that "Industrial IoT will fundamentally disrupt the asset intelligence industry." Suresh Vasudevan, the CEO of Nimble Storage, proclaims "in 2016 the IoT invades the datacenter." That may be, but IoT technologies are far from standardized, and that's a barrier to entry for the datacenter. Maybe that's why the folks at DataArt say "the IoT industry will [see] a year of competition, as platforms strive for supremacy." Maybe the data center invasion will come in 2017, then.

    Otto Berkes, CTO at CA Technologies, asserts that "Bitcoin-born Blockchain shows it can be the storage of choice for sensors and IoT." I hardly fancy myself an expert on blockchain technology, so I asked CA for a little more explanation around this one. A gracious reply came back, explaining that "IoT devices using this approach can transact directly and securely with each other...such a peer-to-peer configuration can eliminate potential bottlenecks and vulnerabilities." That helped a bit, and it incidentally shines a light on just how early-stage IoT technology still is, with respect to security and distributed processing efficiencies.

    Growing up
    Though admittedly broad, the category with the most predictions centered on the theme of value and maturity in Big Data products supplanting the fascination with new features and products. Essentially, value and maturity are proxies for the enterprise-readiness of Big Data platforms.

    Pentaho's Gallivan says that "the cool stuff is getting ready for prime time." MapR's Schroeder predicts "Shiny Object Syndrome Gives Way to Increased Focus on Fundamental Value," and qualifies that by saying "...companies will increasingly recognize the attraction of software that results in business impact, rather than focusing on raw big data technologies." In a related item, Schroeder predicts "Markets Experience a Flight to Quality," further stating that "...investors and organizations will turn away from volatile companies that have frequently pivoted in their business models."

    Sean Ma, Trifacta's Director of Product Management, looking at the manageability and tooling side of maturity, predicts that "Increasing the amount of deployments will force vendors to focus their efforts on building and marketing management tools." He adds: "Much of the capabilities in these tools...will need to replicate functionality in analogous tools from the enterprise data warehouse space, specifically in the metadata management and workflow orchestration." That's a pretty bold prediction, and Ma's confidence in it may indicate that Trifacta has something planned in this space. But even if not, he's absolutely right that this functionality is needed in the Big Data world. In terms of manageability, Big Data tooling needs to achieve not just parity with data warehousing and BI tools, but needs to surpass that level.

    The folks at Signals say "Technology is Rising to the Occasion" and explain that "advances in artificial intelligence and an understanding [of] how people work with data is easing the collaboration between humans and machines necessary to find meaning in big data." I'm not sure if that is a prediction, or just wishful thinking, but it certainly is the way things ought to be. For all the advances we've made in analyzing data using machine learning and artificial intelligence, sifting through the output remains a largely manual process.

    Finally, Mike Maciag, the COO at AltiScale, asserts this forward-looking headline: "Industry standards for Hadoop solidify." Maciag backs up his assertion by pointing to the Open Data Platform initiative (ODPi) and its work to standardize Hadoop distributions across vendors. ODPi was originally anchored by Hortonworks, with numerous other companies, including AltiScale, IBM and Pivotal, jumping on board. The organization is now managed under the auspices of the Linux Foundation.

    Artificial flavor
    Artificial Intelligence (AI) and Machine Learning (ML) figured prominently in this year's predictions as well. Splunk's Antani reasons that "Machine learning will drastically reduce the time spent analyzing and escalating events among organizations." But Lukas Biewald, Founder and CEO of Crowdflower insists that "machines will automate parts of jobs -- not entire jobs." These two predictions are not actually contradictory. I offer both of them, though, to point out that AI can be a tool without being a threat.

    Be that as it may, Biewald also asserts that "AI will significantly change the business models of companies today." He expands on this by saying "legacy companies that aren't very profitable and possess large data sets may become more valuable and attractive acquisition targets than ever." In other words, if companies found gold in their patent portfolios previously, they may find more in their data sets, as other companies acquire them to further their efforts in AI, ML and predictive modeling.

    And more
    These four categories were the biggest among all the predictions but not the only ones, to be sure. Predictions around cloud, self-service, flash storage and the increasing prominence of the Chief Data Officer were in the mix as well. A number of predictions that stood on their own were there too, speaking to issues ranging from salaries for Hadoop admins to open source, open data and container technology.

    What's clear from almost all the predictions, though, is that the market is starting to take basic big data technology as a given, and is looking towards next-generation integration, functionality, intelligence, manageability and stability. This implies that customers will demand certain baseline data and analytics functionality to be part of most technology solutions going forward. And that's a great sign for everyone involved in Big Data.

    Source: ZDNet

     

  • Business Intelligence and beyond: predictions for 2016

    It’s been an interesting year for BI – and 2016 looks set to be no different

    Here are some predictions on what we believe next year has in store, in particular for the data and analytics industry.

    1. Cannibalisation of the channel
    Next year will see many vendors looking to take back control, rather than invest in their channel partners. The danger for the channel is that this will result in vendors keeping good deals or redirecting services projects back to themselves. Platforms such as Amazon Web Services and Microsoft Azure have grown exponentially this year. Another risk is the continued trend of vendors developing hosted solutions via platforms such as these, cutting out their channel partners. In response, the channel needs to look for vendors with a transparent indirect strategy in place and form mutually beneficial relationships.

  • Government CIOs invest mainly in cloud and security


    For their 2018 investment agendas, CIOs in the public sector put cloud solutions, security and data analytics at the top of their priority lists. Business intelligence (BI) and data management also score high. Unlike their peers in other sectors, public-sector CIOs are still barely engaged with artificial intelligence and IoT.

    This emerges from figures from analyst firm Gartner, based on responses from 461 CIOs at national, federal and local governments, defence and intelligence agencies in 98 countries. The results are part of the 2018 CIO Agenda Survey, which covered nearly 3,200 CIOs.

    The public-sector CIOs name cloud solutions, cybersecurity and analytics as the parts of the IT environment that will see the most investment in 2018. According to the respondents, data centre infrastructure is most often singled out as the place to cut costs, while digital transformation is named as the top priority for improving business operations.

    Gartner also compared the public-sector results with those of the most senior IT executives in all other sectors, and some outcomes differ sharply. Artificial intelligence (AI) is one example: it ends up in the top ten of the overall ranking, but among public-sector CIOs it finishes in nineteenth place. CIOs at defence and intelligence agencies are the exception.

    The internet of things (IoT) also has hardly any priority on the investment agendas of public-sector CIOs. Although it ends up in the top ten in the cross-sector average, IoT lands in twelfth place on the public-sector priority list.

    Smart cities

    Only local governments (municipalities) are an exception, because their CIOs sometimes run smart city projects, as are CIOs at defence and intelligence agencies. In general, however, monitoring the data streams coming from sensors has not yet risen far up the list of IT investment priorities in government.

    Top New Tech Spending

    Rank Government Priorities % Respondents
    1 Cloud services/solutions 19%
    2 Cyber/information security  17%
    3 BI/analytics 16%
    4 Infrastructure/data centre 14%
    5 Digitalisation/digital marketing 7%
    6 Data management 6%
    7 Communications/connectivity 6%
    8 Networking, voice/data communications 6%
    9 Application development 5%
    10 Software — development or upgrades 5%

    Source: Gartner (January 2018)

     

    Top Tech to achieve organisation's mission

     

    Rank Government Priorities % Respondents
    1 Cloud services/solutions 19%
    2 BI/analytics 18%
    3 Infrastructure/data centre 11%
    4 Digitalisation/digital marketing 6%
    5 Customer relationship management 5%
    6 Security and risk 5%
    7 Networking, voice and data communications 4%
    8 Legacy modernisation 4%
    9 Enterprise resource planning 4%
    10 Mobility/mobile applications 3%

    Source: Gartner (January 2018)

  • Data and Analytics Fuel Innovation at Celgene

    CIO Richard Williams leads a global IT organization that’s harnessing digital, data, and analytics to support R&D innovation, drive operational excellence, and help Celgene achieve first-mover advantage in the shift to value-based, personalized health care intended to help patients live longer and healthier lives.
     
     
    An explosion of electronic health information is rocking the entire health care ecosystem, threatening to transform or disrupt every aspect of the industry. In the biopharmaceutical sector, that includes everything from the way breakthrough scientific innovations and insights occur to clinical development, regulatory approvals, and reimbursement for innovations. Celgene, the $11 billion integrated global biopharmaceutical company, is no exception.
     
    Indeed, Celgene, whose mission is to discover, develop, and commercialize innovative therapies for the treatment of cancer, immune-inflammatory, and other diseases, is aggressively working to leverage the information being generated across the health care system, applying advanced analytics to derive insights that power its core business and the functions that surround and support it. Long known for its commitment to external scientific collaboration as a source of innovation, Celgene is investing to harness not only the data it generates across the enterprise, but also the real-world health care data generated by its expanding network of partners. Combined, this network of networks is powering tremendous value.
     
    CIO Richard Williams sees his mission—and that of the IT organization he leads—as providing the platforms, data management, and analytics capabilities to support Celgene through the broader industry transition to value-based, personalized health care. At Celgene, this transformation is enabled by a focus on the seamless integration of information and technology. A cloud-first platform strategy, coupled with enterprise information management, serves as the foundation for leveraging the data generated and the corresponding insights from internal and external health care data.
     
    Williams recently shared his perspective on the changes wrought by enormous data volumes in health care, the role of IT at Celgene, and the ways IT supports life sciences innovation.
     
    Can you describe the environment in which Celgene is currently operating?
     
    Williams: We are living in an exciting era of scientific breakthroughs coupled with technology convergence. This creates both disruption and opportunity. The explosion and availability of data, the cloud, analytics, mobility, artificial intelligence, cognitive computing, and other technologies are accelerating data collection and insight generation, opening new pathways for collaboration and innovation. At Celgene, we’re able to apply technology as never before—in protein homeostasis, epigenetics, immuno-oncology, immuno-inflammation, informatics, and other fields of study—to better understand disease and develop targeted therapies and treatments for people who desperately need them.
     
    How does IT support scientific and business innovation at Celgene?
     
    At its core, Celgene IT is business aligned and value focused. Rather than looking at technology for technology’s sake, we view information and technology as essential to achieving our mission and business objectives. As an integrated function, we have end-to-end visibility across the value chain. This enables us to identify opportunities to leverage technology investments to connect processes and platforms across all functions. As a result, we’re able to support improvements in R&D productivity, product launch effectiveness, and overall operational excellence.
     
    This joint emphasis on business alignment and business value, which informs everything we do, is manifest in three important ways:
     
    First is our emphasis on a core set of enterprise platforms, which enable us to provide end-to-end visibility rather than a narrower functional view. We established a dual information- and cloud-first strategy to provide more comprehensive platforms of capabilities that can be shared across Celgene’s businesses. The cloud—especially with recent advances in security and analytics—provides tremendous scale, agility, and value because it allows us to standardize and create both consistency and agility across the entire organization regardless of device or access method. It’s our first choice for applications, compute power, and storage.
     
    Second is our focus on digital and the proliferation of patient, consumer, and scientific data it is creating. Health care data is growing exponentially—from something like 500 petabytes (PB) of data in 2013 to 25,000 PB by 2020, according to one study.
     
    To address this opportunity, we’ve initiated an enterprise information management (EIM) strategy through which we are targeting important data domains across our business and applying definitions, standards, taxonomies, and governance to data we capture internally and from our external partners. Establishing that consistency is critically important. It drives not only innovation, but also insight into our science, operations, and, ultimately, patient outcomes. Celgene is at the forefront in leveraging technologies that offer on-demand compute and analytic services. By establishing data consistency and influencing and setting standards, we will support our own objectives while also benefiting the broader industry.
     
    Third is our support for collaboration—the network of networks—and the appropriate sharing of information across organizational boundaries. We want to harness the capabilities and data assets of our partners to generate insights that improve our science and our ability to get better therapies to patients faster. Celgene is well-known in the industry for external innovation—how we partner scientifically—and we are now extending this approach to data and technology collaboration. One recent example is our alliance with Medidata Solutions, whose Clinical Cloud will serve as our enterprise technology and data platform for Celgene clinical trials worldwide. Celgene is also a founding commercial member of the Oncology Research Information Exchange Network, a collaboration of cancer centers spearheaded by M2Gen, a health informatics solution company. And we have teamed with ConvergeHEALTH by Deloitte and several other organizations for advanced analytics around real-world evidence and knowledge management, which will also be integrated into our data platform.
     
    You’re building this network-enabled, data-rich environment. But are your users prepared to take advantage of it?
     
    That’s an important aspect of the transformation and disruption taking place across multiple industries. Sure, IT can make information, technology, and insights available for improved decision-making, but the growing complexity of the data—whether it’s molecular structures, genomics, electronic medical records, or payment information—demands different skill sets.
     
    Data scientists are in high demand. We need to embed individuals with those specialized skills in functions from R&D to supply chain and commercial. At the same time, many more roles will require analytics acumen as part of the basic job description.
     
    As you build out your platform and data strategies, are you likely to extend those to your external alliances and partners?
     
    External collaboration enabled by shared data and analytics platforms is absolutely part of our collaboration strategy. If our informatics platforms can help our academic or commercial biotech collaborators advance the pace of their scientific evaluations, clinical studies, and commercialization, or they can help us with ours, that’s a win-win situation—and a differentiator for Celgene. We are already collaborating with Sage Bionetworks, leveraging Apple ResearchKit to develop an app that engages patients directly in innovation aimed at improving treatments for their diseases. We’re also working with IBM Watson to increase patient safety using cognitive computing to improve drug monitoring. As the power of collaborative innovation continues, collaboration will become more commonplace and lead to some amazing results.
     
    As you look out 12 to 18 months, what technologies might you want to bolt onto this platform or embed in your EIM strategy?
     
    The importance of cognitive computing, including machine learning and artificial intelligence, will continue to grow, helping us to make sense of the increasing volumes of data. The continued convergence of these technologies with the internet of things and analytics is another area to watch. It will result in operational insights as well as new, more intelligent ways to improve treatments for disease.
     
    What advice do you have for CIOs in health care or other industries who may not be as far along in their cloud, data, and analytics journeys?
    A digital enterprise is a knowledge- and information-driven enterprise, so CIOs should first focus on providing technologies and platforms that support seamless information sharing. In the process, CIOs should constantly be looking at information flows through an enterprise lens—real value is created when information is connected across all functions. Next, it’s increasingly important for CIOs to help build a technology ecosystem that allows the seamless exchange of information internally and externally because transformation and insight will occur in both places. Last, CIOs need to recognize that every job description will include data and information skills. This is an especially exciting time to be in IT because the digital capabilities we provide increasingly affect every function and role. We need to help people develop the skills they need to take advantage of what we can offer now and in the future.
    Source: deloitte.wsj.com, November 14, 2016
  • Dealing with the challenges of data migration


    Once you have decided to migrate your data warehouse to a cloud-based database, the hard and risky work of data migration begins.

    Organizations of all sizes and maturities already have data warehouses deployed and in operation. Modernizing, upgrading, or otherwise improving an incumbent warehouse regularly involves migrating data from platform to platform, and migrations today increasingly move data from on-premises to cloud systems. This is because replatforming is a common data warehouse modernization strategy, whether you will rip-and-replace the warehouse's primary platform or augment it with additional data platforms.

    Even when using an augmentation strategy for data warehouse modernization, 'data balancing' is an inevitable migration task as you redistribute data across the new combination of old and new platforms.

    In a related direction, some data warehouse modernization strategies simplify bloated and redundant portfolios of databases (or take control of rogue data marts and analytics sandboxes) by consolidating them onto fewer platforms, with cloud-based databases increasingly serving as a consolidation platform.

    In all these modernization strategies, the cloud plays an important role. For example, many organizations have a cloud-first mandate because they know that cloud computing is the future of data center infrastructure. In addition, the cloud is a common target for data warehouse modernization because cloud-based data platforms are the most modern ones available for warehouses today.

    Finally, a cloud is an easily centralized and globally available platform, which makes it an ideal target for data consolidation, as well as popular use cases such as analytics, self-service data practices, and data sharing across organizational boundaries.

    Users who modernize a data warehouse need to plan carefully for the complexity, time, business disruption, risks, and costs of migrating and/or consolidating data onto cloud-based platforms suitable for data warehousing, as follows.

    Avoid a big bang project

    That kind of plan attempts to modernize and migrate too much too fast. The large size and complexity of deliverables raises the probability of failure. By comparison, a project plan with multiple phases will be a less risky way to achieve your goals for modernization and cloud migration. A multiphase project plan segments work into multiple manageable pieces, each with a realistic technical goal that adds discernable business value.

    The first deliverable should be easy but useful 

    For example, successful data migration or replatforming projects should focus the first phase on a data subset or use case that is both easy to construct and in high demand by the business. Prioritize early phases so they give everyone confidence by demonstrating technical prowess and business value. Save problematic phases for later.

    Cloud migration is not just for data 

    You are also migrating (or simply redirecting the access of) business processes, groups of warehouse end users, reports, applications, analysts, developers, and data management solutions. Your plan should explain when and how each entity will be migrated or redirected to cloud. Managers and users should be involved in planning to ensure their needs are addressed with minimal disruption to business operations.

    Manage risk with contingency plans

    Expect to fail, but know that segmenting work into phases has the added benefit of limiting the scope of failure. Be ready to recover from failed phases via roll back to a prior phase state. Don't be too eager to unplug the old platforms because you may need them for roll back. It is inevitable that old and new data warehouse platforms (both on premises and on clouds) will operate simultaneously for months or years depending on the size and complexity of the data, user groups, and business processes you are migrating.

    Beware lift-and-shift projects

    Sometimes you can 'lift and shift' data from one system to another with minimal work, but usually you cannot. Even when lift and shift works, developers need to tweak data models and interfaces for maximum performance on the new platform. A replatforming project can easily turn into a development project when data being migrated or consolidated requires considerable work.

    In particular, organizations facing migrations of older applications and data to cloud platforms should assume that lift and shift will be inadequate because old and new platforms (especially when strewn across on-premises and cloud systems) will differ in terms of interfaces, tool or platform functionality, and performance characteristics. When the new platform offers little or no backward compatibility with the old one, development may be needed for platform-specific components, such as stored procedures, user-defined functions, and hand-coded routines.

    Improve data, don't just move it

    Problems with data quality, data modeling, and metadata should be remediated before or during migration. Otherwise you're just bringing your old problems into the new platform. In all data management work, when you move data you should also endeavor to improve it, as sketched below.
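    As a small illustration of improving data while moving it, the sketch below validates and normalizes records during migration rather than copying them verbatim; the field names, formats, and rules are hypothetical.

```python
# Minimal sketch of improving data while moving it: validate and clean
# each record during migration instead of copying problems verbatim.
# Field names, formats, and rules are hypothetical.
from datetime import datetime
from typing import Optional

def clean_record(record: dict) -> Optional[dict]:
    """Return a cleaned record, or None if it should be quarantined."""
    rec = dict(record)

    # Reject rows missing a business key.
    if not rec.get("customer_id"):
        return None

    # Normalize inconsistent formatting picked up over the years.
    rec["email"] = (rec.get("email") or "").strip().lower()

    # Standardize dates to ISO 8601 before loading the new warehouse.
    raw = rec.get("signup_date", "")
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            rec["signup_date"] = datetime.strptime(raw, fmt).date().isoformat()
            break
        except ValueError:
            continue

    return rec

source_rows = [
    {"customer_id": "42", "email": " Jane@Example.COM ", "signup_date": "03/07/2015"},
    {"customer_id": "",   "email": "broken-row",         "signup_date": ""},
]

migrated = [r for r in (clean_record(row) for row in source_rows) if r]
print(migrated, "| quarantined:", len(source_rows) - len(migrated))
```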

    Assemble a diverse team for modernizing and replatforming a data warehouse 

    Obviously, data management professionals are required. Data warehouse modernization and replatforming usually need specialists in warehousing, integration, analytics, and reporting. When tweaks and new development are required, experts in data modeling, architecture, and data languages may be needed. Don't overlook the maintenance work required of database administrators (DBAs), systems analysts, and IT staff. Before migrating to a cloud-based data warehouse platform, consider hiring consultants or new employees who have cloud experience, not just data management experience. Finally, do not overlook the need for training employees on the new cloud platform.

    Data migrations affect many types of people 

    Your plan should accommodate them all. A mature data warehouse will serve a long list of end users who consume reports, dashboards, metrics, analyses, and other products of data warehousing and business intelligence. These people report to a line-of-business manager and other middle managers. Affected parties (i.e., managers and sometimes end users, too) should be involved in planning a data warehouse modernization and migration to cloud. First, their input should affect the whole project from the beginning so they get what they need to be successful with the new cloud data warehouse. Second, the new platform roll-out should take into consideration the productivity and process needs of all affected parties.

    Coordinate with external parties when appropriate

    In some scenarios, such as those for supply chain, e-commerce, and business-to-business relationships, the plan for migration to cloud should also stipulate dates and actions for partners, suppliers, clients, customers, and other external entities. Light technical work may be required of external parties, as when customers or suppliers have online access to reports or analytics supported by a cloud data warehouse platform.

    Author: Philip Russom

    Source: TDWI

  • Edge Computing in a Nutshell


    Edge computing (EC) allows data generated by the Internet of Things (IoT) to be processed near its source, rather than sending the data great distances, to data centers or a cloud. More specifically, edge computing uses a network of micro-data stations to process or store the data locally, within a range of 100 square feet. Prior to edge computing, it was assumed all data would be sent to the cloud using a large and stable pipeline between the edge/IoT device and the cloud.

    Typically, IoT devices transfer data, sometimes massive amounts, sending it all to a data center, or cloud, for processing. With edge computing, processing starts near the source. Once the initial processing has occurred, only the data needing further analysis is sent. EC screens the data locally, reducing the volume of data traffic sent to the central repository.

    This tactic allows organizations to process data in “almost” real time. It also reduces the network’s data stream volume and eliminates the potential for bottlenecks. Additionally, nearby edge devices can “potentially” record the same information, providing backup data for the system.

    A variety of factors are promoting the expansion of edge computing. The cost of sensors has been decreasing, while simultaneously, the pace of business continues to increase, with real-time responses providing a competitive advantage to its users. Businesses using edge computing can analyze and store portions of data quickly and inexpensively. Some are theorizing edge computing means an end to the cloud. Others believe it will complement and support cloud computing.

    The Uses of Edge Computing

    Edge computing can be used to help resolve a variety of situations. When IoT devices have poor connectivity, or when the connection is intermittent, edge computing provides a convenient solution because it doesn’t need a connection to process the data or make a decision.

    It also has the effect of reducing time loss, because the data doesn’t have to travel across a network to reach a data center or cloud. In situations where a loss of milliseconds is unacceptable, such as in manufacturing or financial services, edge computing can be quite useful.

    Smart cities, smart buildings, and building management systems are ideal for the use of edge computing. Sensors can make decisions on the spot, without waiting for a decision from another location. Edge computing can be used for energy and power management, controlling lighting, HVAC, and energy efficiency.

    A few years ago, PointGrab announced an investment by Philips Lighting and Mitsubishi UFJ Capital in CogniPoint™, its edge-analytics sensor solution for smart buildings. PointGrab is a company that provides smart sensor solutions for automated buildings.

    The company uses deep learning technology in developing its sensors, which detect occupants’ locations, maintain a head count, monitor their movements, and adjust the internal environment using real-time analytics. PointGrab’s Chief Business Officer, Itamar Roth, stated:

    “CogniPoint’s ultra-intelligent edge-analytics sensor technology will be a key facilitator for capturing critical data for building operations optimization, energy savings improvement, and business intelligence.”

    Another example of edge computing is the telecommunication companies’ expansion of 5G cellular networks. Kelly Quinn, an IDC research manager, predicts telecom providers will add micro-data stations that are integrated into 5G towers, or located near the towers. Business customers can own or rent the micro-data stations for edge computing. (If rented, negotiate direct access to the provider’s broader network, which can then connect to an in-house data center, or cloud.)

    Edge Computing vs. Fog Computing

    Edge computing and fog computing both deal with processing and screening data prior to its arrival at a data center or cloud. Technically, edge computing is a subdivision of fog computing. The primary difference is where the processing takes place.

    With fog computing, the processing typically happens near the local area network (but technically, can happen anywhere between the edge and a data center/cloud), using a fog node or an IoT gateway to screen and process data. Edge computing processes data within the same device, or a nearby one, and uses the communication capabilities of edge gateways or appliances to send the data. (A gateway is a device/node that opens and closes to send and receive data. A gateway node can be part of a network’s “edge.”)

    Edge Computing Security

    There are two arguments regarding the security of edge computing. Some suggest security is better with edge computing because the data stays closer to its source and does not move through a network. They argue the less data stored in a corporate data center, or cloud, the less data that is vulnerable to hackers.

    Others suggest edge computing is significantly less secure because “edge devices” can be extremely vulnerable, and the more entrances to a system, the more points of attack available to a hacker. This makes security an important aspect in the design of any “edge” deployment. Access control, data encryption, and the use of virtual private network tunneling are important parts of defending an edge computing system.

    The Need for Edge Computing

    There is an ever-increasing number of sensors providing a base of information for the Internet of Things. It has traditionally been a source of big data. Edge computing, however, attempts to screen the incoming information, processing useful data on the spot, and sending it directly to the user. Consider the sheer volume of data being supplied to the Internet of Things by airports, cities, the oil drilling industry, and the smart phone industry. The huge amounts of data being communicated creates problems with network latency, bandwidth, and the most significant problem, speed. Many IoT applications are mission-critical, and the need for speed is crucial.

    EC can lower costs and provide a smooth flow of service. Mission critical data can be analyzed, allowing a business to choose the services running at the edge, and to screen data sent to the cloud, lowering IoT costs and getting the most value from IoT data transfers. Additionally, edge computing provides “Screened” big data.

    Transmitting immense amounts of data is expensive and can strain a network’s resources. Edge computing processes data at, or near, the source and sends only relevant data through the network to a data processor or cloud. For instance, a smart refrigerator doesn’t need to continuously send temperature data to a cloud for analysis. Instead, the refrigerator can be designed to send data only when the temperature moves beyond a certain range, minimizing unnecessary data. Similarly, a security camera would only send data after detecting motion.
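    A minimal sketch of that screening logic, with a hypothetical temperature band, might look like the following; it only illustrates the idea of forwarding out-of-range readings, not any particular edge platform.

```python
# Minimal edge-filtering sketch with hypothetical thresholds: process
# readings locally and forward only the ones that need central attention.
ACCEPTABLE_RANGE = (2.0, 6.0)   # e.g. refrigerator temperature in degrees C

def screen_reading(temperature_c: float) -> bool:
    """Return True if the reading should be sent upstream."""
    low, high = ACCEPTABLE_RANGE
    return not (low <= temperature_c <= high)

readings = [3.8, 4.1, 4.0, 7.2, 3.9]          # local sensor samples
to_forward = [t for t in readings if screen_reading(t)]

print(f"{len(readings)} readings sampled, {len(to_forward)} forwarded: {to_forward}")
```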

    Depending on how the system is designed, edge computing can direct manufacturing equipment (or other smart devices) to continue operating without interruption, should internet connectivity become intermittent, or drop off, completely, providing an ideal backup system.

    It is an excellent solution for businesses needing to analyze data quickly in unusual circumstances, such as airplanes, ships, and some rural areas. For example, edge devices could detect equipment failures, while “not” being connected to a cloud or control system. Examples of edge computing include:

    Internet of Things

    • Smart streetlights
    • Home appliances
    • Motor vehicles (Cars and trucks)
    • Traffic lights
    • Thermostats
    • Mobile devices

    Industrial Internet of Things (IIoT)

    • Smart power grid technology
    • Magnetic resonance (MR) scanner
    • Automated industrial machines
    • Undersea blowout preventers
    • Wind turbines

    Edge Computing Complements the Cloud

    The majority of businesses using EC continue to use the cloud for data analysis. They use a combination of the systems, depending on the problem. In some situations, the data is processed locally, and in others, data is sent to the cloud for further analysis. The cloud can manage and configure IoT devices, and analyze the “Screened” big data provided by edge devices. Combining the power of edge computing and the cloud maximizes the value of the Internet of Things. Businesses will have the ability to analyze screened big data, and act on it with greater speed and precision, offering an advantage over competitors.

    Device Relationship Management

    Device Relationship Management (DRM) is about monitoring and maintaining equipment using the Internet, and includes controlling these “sensors on the edge.” DRM is designed specifically to communicate with the software and microprocessors of IoT devices and lets organizations supervise and schedule the maintenance of its devices, ranging from printers to industrial machines to data storage systems. DRM provides preventative maintenance support by giving organizations detailed diagnostic reports, etc. If an edge device is lacking the necessary hardware or software, these can be installed. Outsourcing maintenance on edge devices can be more cost effective at this time than hiring an in-house maintenance staff, particularly if the maintenance company can access the system by way of the internet.

    Author: Keith D. Foote

    Source: Dataversity

  • Four Drivers of Successful Business Intelligence

    Companies across industries face some very common scenarios when it comes to getting the most value out of data. The life science industry is no exception. Sometimes a company sets out to improve business intelligence (BI) for a brand, division or functional area. It spends many months or years and millions of dollars to aggregate all of the data it thinks it needs to better measure performance and make smart business decisions only to yield more data. In another familiar scenario, a team identifies critical questions the BI system can't answer. Again, months and millions go into development. But by the time the system goes live, market and/or company conditions have changed so much that the questions are no longer relevant.

    Building Better Business Intelligence Systems
    Today's challenges cannot be met by throwing more dollars into the marketing budget or by building more, or bigger, data warehouses. Ultimately, navigating today's complexities and generating greater value from data isn't about more, it's about better. The good news is that other industries have demonstrated the power and practicality of analytics at scale. Technology has evolved to overcome fragmented data and systems. We are now observing a real push in life sciences for a BI capability that's smarter and simpler.

    So how do we build better business intelligence platforms? In working with life sciences companies around the globe, IMS Health has observed a recurring journey with three horizons of business intelligence maturity: alignment of existing KPIs, generation of superior insights and customer-centric execution (see Figure 1).

    What does it take to advance in business intelligence maturity?
    No matter where a company currently stands, there are four fundamental steps that drive BI success: aligning business and information management strategy, improving information management systems integration and workflow, engineering BI systems to derive more value and insights from data, and making the most of new cloud computing technologies and Software-as-a-Service (SaaS) models for delivery.

    Step 1: Align Business and Information Management Strategy
    Many IT and business leaders recognize that the traditional "build it and they will come" mentality can no longer sustain future growth in agile and cost-efficient ways. To be successful, companies need to focus upfront on developing an information management strategy that begins with the business in mind. Through a top-down and upfront focus on critical business goals, drivers and pain points, companies can ensure that key insights are captured to drive development of commercial information management strategies that align with prioritized business needs. Leading organizations have achieved success via pilot-and-prove approaches that focus on business value at each step of the journey. To be successful, the approach must be considered in the context of the business and operational strategies.

    Step 2: Improving Information Management Systems Integration and Workflow
    Although technology systems and applications have proliferated within many organizations, they often remain siloed and sub-optimized. Interoperability is now a key priority and a vehicle for optimizing commercial organizations: improving workflow speed, eliminating conflicting views of the truth across departments and paring down vendor teams managing manual data handoffs. Information and master data management systems must be integrated to deliver an integrated view of the customer. When optimized, these systems can enable advanced BI capabilities ranging from improved account management and evolved customer interactions (i.e. account-based selling and management, insights on healthcare networks and relationships with influencers and KOLs) to harnessing the power of big data and demonstrating value to all healthcare stakeholders.

    Step 3: Engineering BI Systems to Derive More Value and Insights from Data
    Life sciences companies compete on the quality of their BI systems and their ability to take action in the marketplace. Yet existing analytics systems often fail to deliver value to end users. Confusing visualizations, poorly designed data queries and gaps in underlying data are major contributors in a BI solution's inability to deliver needed insights.

    By effectively redesigning BI applications, organizations can gain new insights and build deeper relationships with customers while maximizing performance. Effective BI tools can also help to optimize interventions and the use of healthcare resources. They can drive post-marketing research by unearthing early signals of value for investigation, help companies better engage and deliver value to their customers, and contribute to improved patient outcomes. This information can advance the understanding of how medicine is practiced in the real world, from disease prevention through diagnosis, treatment and monitoring.

    Step 4: Making the Most of New Cloud Computing Technologies and Software-as-a-Service (SaaS) Models for Delivery
    Chief information officers (CIOs) are increasingly looking to adopt cloud technologies in order to bring the promise of technology to commercialization and business intelligence activities. They see the potential value of storing large, complex data sets, including electronic medical records and other real-world data, in the cloud. What's more, cloud companies have taken greater responsibility for maintaining government-compliant environments for health information.

    New cloud-based BI applications are fueling opportunities for life sciences companies to improve delivery of commercial applications, including performance management, advanced analytics, sales force automation, master data management and the handling of large unstructured data streams. As companies continue their journey toward BI maturity, getting the most from new technologies will remain a high priority. Leveraging cloud-based information management and business intelligence platforms will bring tremendous benefits to companies as approaches are revised amidst changing customer demands and an urgent need for efficiency.

    The Way Forward
    While each organization's journey will be unique, advancing in business intelligence maturity, and getting more value from data, can be achieved by all with these four steps. It's time for BI that's smarter and simpler and that realizes greater value from data. With focus and precision, and the support of business and technology experts, companies can hone in on the key indicators and critical questions that measure, predict and enhance performance.

    Source: ExecutiveInsight

  • Google buys Anvato to strengthen its cloud platform

    Google has acquired Anvato, a company that handles the encoding, editing, publishing and distribution of a wide range of videos across multiple platforms. The search giant wants to add Anvato to its cloud platform and implement the technology in its own services. How Google intends to do this is not known.


    What is known is that American broadcasters such as NBCUniversal, Fox Sports and MSNBC use Anvato's services to create and deliver online video. It is a service that Google's own cloud platform does not yet offer, which is presumably the reason for the purchase.

    "Onze teams gaan samenwerken om cloudoplossingen te bieden om bedrijven in de media en entertainmentindustrie te helpen hun video-infrastructuur te schalen en hoge kwaliteit live-video’s en on-demand content aan consumenten te bieden op elk apparaat – of dat nou een smartphone, tablet of smart-tv is", stelt Google’s senior productmanager Belwadi Srikanth in een statement.

    It is not known how much Google paid for the company. At its founding in 2007, Anvato raised about $2.5 million in a funding round, but the company's value will have grown considerably since then.

    Source: Techzine

  • How enterprises can (try to) deal with cloud outages  


    Better observability tools can help net managers maintain some resilience to cloud service outages, but provider misconfigurations and DNS infrastructure issues are out of their control.

    Over the last year or so, major outages at cloud, Internet, and content delivery network providers significantly disrupted operations at businesses ranging from local mom-and-pop stores to international companies.

    Anxiety about such outages was high among the respondents to this year’s InformationWeek/Network Computing survey of 300 IT professionals involved in managing networks. More than half of the respondents said they were concerned or very concerned about the impact of cloud outages on their network operations. Such concerns were justified: the leading cause of lost network availability in the last year was a significant outage due to an incident at a third-party provider.

    In a small number of outages, enterprises had a way to minimize the impact. For example, during the June 2022 Microsoft Azure and M365 Online 12-hour outage, most east coast companies served through the impacted Virginia data center could not access services. However, those companies with premium always-available or zone-redundant services in that region were not impacted.

    That gets to a point raised by some industry experts who believe you can have a cloud that is reliable, fast, or low cost—but you can only have two of the three. In the June incident, companies had the option to trade off reliability for cost (i.e., the cost of the premium services).

    A need to know what’s happening

    Unfortunately, enterprises had no workaround for many of the outages over the last year that had the greatest impact. Most were caused by updates gone wrong or configuration changes by the providers. And many were global in nature. Things like zone-redundant premium services would not help.

    So, what can network managers and enterprises do? Perhaps the most important things to do are to understand the operational vulnerabilities that could arise with a particular outage and keep pressure on the providers to improve their processes to avoid future outages.

    With regard to the first point, the complexity of modern apps makes it hard to even know how an outage might impact an organization. Many applications today are built using numerous components and services, some of which are third-party offerings.

    So, a bank might support many services, like the ability to check an account balance from a cellphone app via on-premises systems. But an online loan application might crash if the bank uses a third-party credit-worthiness database running on a cloud provider experiencing an outage.

    Last year, such interdependencies were significant when Facebook experienced a six-hour outage that took down numerous applications and services that used Facebook credentials for logins.

    These instances and others, where there are complex and often hidden dependencies, are driving demand for modern observability tools that help enterprises better understand the interplay of the various components that make up their apps and services. Such tools are also helpful in identifying root cause problems in today’s distributed and interlinked applications.

    Focus on the providers

    That Facebook outage, like the CloudFlare and Google Cloud incidents and many of the other major outages over the last year, was caused by a faulty configuration change.

    After diagnosing the problem, Facebook stated: “Our engineering teams have learned that configuration changes on the backbone routers that coordinate network traffic between our data centers caused issues that interrupted this communication. This disruption to network traffic had a cascading effect on the way our data centers communicate, bringing our services to a halt.”

    The outage resulted from a misconfiguration of Facebook's server computers, preventing external computers and mobile devices from connecting to the Domain Name System (DNS) and finding Facebook, Instagram, and WhatsApp.

    Perhaps the biggest lesson enterprise IT managers should take away from that outage is for companies to “avoid putting all of their eggs into one basket," says Chris Buijs, EMEA Field CTO at NS1. “In other words, they should not place everything, from DNS to all of their apps, on a single network.”

    Specifically, companies should use a DNS solution that is independent of their cloud or data center. If a provider goes down, a company will still have a functioning DNS to direct users to other facilities.
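
    As a rough illustration of the monitoring side of that advice, the sketch below checks whether a few critical hostnames still resolve through more than one independent resolver. It is a minimal example, assuming the third-party dnspython library is available; the hostnames and resolver addresses are placeholders, not a recommendation.

    import dns.resolver  # third-party: pip install dnspython

    # Hypothetical hostnames and public resolvers, used purely for illustration.
    CRITICAL_HOSTS = ["app.example.com", "api.example.com"]
    INDEPENDENT_RESOLVERS = {"resolver-a": "8.8.8.8", "resolver-b": "1.1.1.1"}

    def resolves(host: str, nameserver: str) -> bool:
        """Return True if the host resolves to an A record via the given nameserver."""
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [nameserver]
        resolver.lifetime = 3.0  # fail fast during an outage
        try:
            resolver.resolve(host, "A")
            return True
        except Exception:
            return False

    for host in CRITICAL_HOSTS:
        results = {name: resolves(host, ip) for name, ip in INDEPENDENT_RESOLVERS.items()}
        if not any(results.values()):
            print(f"ALERT: {host} does not resolve on any resolver: {results}")
        elif not all(results.values()):
            print(f"WARN: {host} resolves only partially: {results}")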

    That thinking has brought more attention to the core DNS system. All internet and cloud services are in trouble if anything happens to the 13 DNS root server systems worldwide. That point has been embraced by the Internet Corporation for Assigned Names and Numbers (ICANN), the organization that coordinates the DNS system.

    Unfortunately, there are both technical and political issues to consider when it comes to the DNS system. The Ukraine war raised the possibility of using the DNS system to isolate Russia, cutting its websites off from the Internet. That request was rejected. "In our role as the technical coordinator of unique identifiers for the Internet, we take actions to ensure that the workings of the Internet are not politicized, and we have no sanction-levying authority,” said Göran Marby, ICANN president and CEO. “Essentially, ICANN has been built to ensure that the Internet works, not for its coordination role to be used to stop it from working."

    On the technical front, the Internet Assigned Numbers Authority (IANA), which is a group within ICANN that is responsible for maintaining the registries of Internet unique identifiers powering the DNS system, conducts two third-party audits each year on different aspects of its operations to ensure the security and stability of the Internet's unique identifier systems. The reviews cover infrastructure, software, computer security controls, and network security controls.

    Additionally, strengthening the security of the Domain Name System and the DNS Root Server System is one of the key tenets of ICANN’s five-year plan released in 2021.

    Key takeaways

    Enterprises are dependent on Internet and cloud services. Unlike in the past, when a company could keep a backup telecom provider, in most cases there is only one provider today for any given task.

    There are very few options to minimize the impact of a major outage. In some cases, there may be premium high-availability offerings that help if a provider’s regional center goes down. But many providers do not offer such services, and most often, the outages are due to configuration errors that quickly propagate globally, so those services do not help.

    The best that enterprise network managers can do is use monitoring and observability tools and services to know when an outage has occurred and how it impacts their applications and services. Beyond that, the only option is to pressure providers to develop and carry out best practices that help avoid major service disruptions. 

    Author: Salvatore Salamone

    Source: Network Computing

  • How organizations can control the carbon emissions caused by using the cloud

    How organizations can control the carbon emissions caused by using the cloud

    The move to the cloud is not necessarily a carbon-free transition, which means businesses need to be folding cloud-based emissions into their overall ESG strategy.

    Cloud computing is an increasing contributor to carbon emissions because of the energy needs of data centers.

    With demand for digital services and cloud-based computing rising, industry efforts concentrated on energy efficiency will be required. This means organizations across all verticals must fold their cloud carbon footprint into their environmental, social, and governance (ESG) targets.

    This is especially true for those organizations that have committed to net-zero or science-based targets or other similar decarbonization commitments, as cloud computing would need to be accounted for in the calculations.

    Depending on an organization’s business model, and especially for companies that focus on digital services, the energy consumed through cloud computing can be a material portion of their overall emissions.

    At the same time, shifting to the cloud can contribute to reducing the carbon footprint if it is approached with intent and explicitly built into the DNA of technology deployment and management.

    Major Cloud Providers Offering Insight

    Casey Herman, PwC US ESG leader, explained that the major cloud service providers -- Google, Amazon, Microsoft -- are already providing data on energy usage and emissions on a regular basis.

    “Smaller players are still playing catch-up either providing online calculations, which require customers to be responsible for securing these values, or there is no information provided at all,” he says. “CIOs should have their operational teams monitor these and preferentially select those service providers that provide real-time tools to optimize the energy usage.”

    He notes that CIOs should also increasingly build or purchase tools that give a holistic view across all of their cloud computing impacts: currently, they would need to look at each provider separately and then aggregate the results outside of any tools the service providers may offer.

    “At PwC, we have been piloting an IT sustainability dashboard that collects data from public cloud providers and on-premises systems and then provides views on key sustainability metrics like energy reuse efficiency or carbon usage effectiveness,” he adds.
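
    To make the aggregation step of such a dashboard concrete, here is a minimal sketch that combines per-provider energy and emissions figures into a single overview and derives a simple carbon-intensity ratio in the spirit of the carbon usage effectiveness metric mentioned above. The provider names and numbers are hypothetical placeholders, not data from PwC's dashboard.

    # Hypothetical per-provider figures (kWh of IT energy, kg CO2e emitted);
    # a real dashboard would pull these from provider APIs or sustainability reports.
    provider_data = {
        "cloud_provider_a": {"it_energy_kwh": 120_000, "co2e_kg": 42_000},
        "cloud_provider_b": {"it_energy_kwh": 45_000, "co2e_kg": 21_000},
        "on_premises": {"it_energy_kwh": 80_000, "co2e_kg": 56_000},
    }

    def carbon_intensity(co2e_kg: float, it_energy_kwh: float) -> float:
        """Simplified ratio: kg CO2e emitted per kWh of IT energy consumed."""
        return co2e_kg / it_energy_kwh

    total_energy = sum(d["it_energy_kwh"] for d in provider_data.values())
    total_co2e = sum(d["co2e_kg"] for d in provider_data.values())

    for name, d in provider_data.items():
        print(f"{name}: {carbon_intensity(d['co2e_kg'], d['it_energy_kwh']):.2f} kg CO2e/kWh")

    print(f"overall: {carbon_intensity(total_co2e, total_energy):.2f} kg CO2e/kWh")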

    Herman says that ultimately, organizations are seeking greater use of data for more advanced analysis, which will consume ever more computing power and therefore more energy.

    “Cloud service providers have been quick to reduce their carbon footprints, including public statements and investing money in renewables and carbon capture projects,” he says. “These organizations are putting in a carbon-neutral infrastructure that could then support the current and growing demand for data, analytics, and computing power.”

    Using Migration to Install Tools

    In fact, shifting to the cloud (provided it's the right provider) could reduce a company's carbon footprint through optimization and rationalization of on-premises/private data centers to more efficient (energy and carbon) cloud-based data centers.

    A company can also use their cloud migration program as a catalyst to transform their technology footprint and become environmentally conscious by design.

    Herman says that this can include re-architecting applications and building within enterprise architecture a strategy to utilize more discrete and reusable components (microservices, APIs), preventing wasteful use of energy in the cloud.

    The key to getting cloud carbon impact initiatives underway is aligning the ambition and strategy of the overall business with the IT and digital function around ESG and being an active champion of the ESG agenda within the organization.

    “Without the tools to measure the carbon footprint of their cloud footprint, companies will struggle to holistically aggregate relevant carbon impact for their IT department or manage to net zero, especially when these represent meaningful parts of their overall footprint,” Herman says.

    He explains that measurement tools and processes will also allow the organization to leverage that same data and insights to support decarbonization agendas and strategies in the business.

    AI Provides Insight into Cloud Emissions

    For Chris Noble, co-founder and CEO of Cirrus Nexus, the focus for his company has been on artificial intelligence designed to help companies quantify and shrink the amount of carbon their cloud operations produce.

    “By giving organizations the chance to impose a cost on that carbon, it allows them to make a better-informed business decision as to their impact on the environment, and then to drive that actual behavior,” he says.

    By giving businesses a window into how much emissions their cloud computing demands are producing, those organizations are then able to form a roadmap that will help their ESG strategy.

    This is a part of transparency reporting, which Noble notes will be increasingly required through government regulations.

    “There's a lot of people making claims about carbon neutrality, but there's no way to verify that -- there's no proof,” he says. “What we allow companies to do is to see what that activity is.”

    He says that for IT departments to understand cloud-based carbon emissions as a business problem, they need parameters and metrics by which they can put a cost on the issue and work toward resolving it.

    “How do we educate, inform and drive that behavioral change across their environments?” Noble says. “We spend a lot of time doing that.”

    Reliable Data Intelligence is Critical

    Elisabeth Brinton, Microsoft’s corporate vice president of sustainability, says that accurate, reliable data intelligence is critical for the success of ESG initiatives.

    “For organizations to truly address the sustainability imperative, they need continuous visibility and transparency into the environmental footprint of their entire operations, their products, the activities of their people and their value chain,” she says.

    Just as organizations rely on real-time financial reporting and forecasts to guide decisions that affect the fiscal bottom line, they need foundational intelligence to inform sustainability-related decisions.

    “Leveraging a cloud platform offers organizations comprehensive, integrated, and increasingly automated sustainability insights to help monitor and manage their sustainability performance,” Brinton says.

    With cloud technology and a partner ecosystem, cloud providers like Microsoft are also bringing integrated solutions to connect organizations and their value chain, ultimately helping organizations integrate sustainability into their culture, activities, and processes to prioritize actions to minimize their environmental impact.

    Microsoft Cloud for Sustainability is the company’s first horizontal industry cloud designed to work across multiple industries, with solutions that can be customized to specific industry needs. At its core is a data model that aligns with the Greenhouse Gas Protocol, the standard for identifying and recording emissions information.

    Brinton explains that as the company operationalizes its sustainability plan, Microsoft is sharing its expertise and developing tools and methods customers can replicate.

    “We’re also thinking about where we’re going, what we have to solve as a company to walk our own talk, and how we’re going to enable our customers to deal with that complexity so that at the end, they’re coming out on the other side as well,” Brinton says.

    The Customer Demand for Clean Clouds

    Kalliopi Chioti, chief ESG officer at financial services software firm Temenos, notes that banks are heavy users of data centers, so being part of this positive trend of moving from legacy on-premises servers to modern cloud infrastructure will have a significant impact on emissions.

    Temenos Banking Cloud, the company’s next-generation SaaS, incorporates ESG-as-a-service to help banks reduce their energy and emissions, gain carbon insights from using their products, and to track their progress towards reaching their sustainability targets.

    It also runs on public cloud infrastructure, and the hyperscalers Temenos partners with have all made commitments to sustainability goals, science-based targets and using 100% renewable energy. “All these energy efficiencies are passed onto our clients,” Chioti says. “Let’s also remember that banks are in a unique position to influence the transition to a low-carbon economy.”

    She points out that the move to the cloud also has commercial implications: Consumers are not passive bystanders to the climate agenda, and they are increasingly matching their money with their values and voting with their wallets.

    “If companies want to continue to thrive and grow in the new era, they need to listen to their customers,” she says. “That starts with using cloud banking solutions to transform their climate credentials and show their customers the work they are doing to transition to a low-carbon global economy.”

    Author: Nathan Eddy

    Source: InformationWeek

  • How serverless machine learning and choosing the right FaaS benefit AI development

    How serverless machine learning and choosing the right FaaS benefit AI development

    Getting started with machine learning throws multiple hurdles at enterprises. But the serverless computing trend, when applied to machine learning, can help remove some barriers.

    IT infrastructure that enables rapid scaling, integration and automation is a greatly valued commodity in a marketplace that is evolving ever faster. And fast, serverless machine learning is a primary example of that.

    Serverless computing is a cloud-based model wherein a service provider accepts code from a customer, dynamically allocates resources to the job and executes it. This model can be more cost-effective than conventional pay-or-rent server models. Elasticity replaces scalability, relieving the customer of deployment grief. Code development can be far more modular. And headaches from processes like HTTP request processing and multithreading vanish altogether.

    It's as efficient as development could possibly be from the standpoint of time and money: The enterprise pays the provider job by job, billed only for the resources consumed in any one job execution. This simple pay-as-you-go model frees up enterprise resources for more rapid app and service development and levels the playing field for development companies not at the enterprise level.

    Attractive as it is, how can this paradigm accommodate machine learning, which is becoming a mission-critical competitive advantage in many industries?

    A common problem in working with machine learning is moving training models into production at scale. It's a matter of getting the model to perform for a great many users, often in different places, as fast as the users need it to do so. Nested in this broad problem is the more granular headache of concept drift, as the model's performance degrades over time with increasing variations in data, which causes such models to need frequent retraining. And that, in turn, creates a versioning issue and so on.
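
    As a rough illustration of how concept drift is often caught in practice, the sketch below compares a model's rolling accuracy on labeled production data against its accuracy at deployment time and flags the model for retraining when the gap exceeds a threshold. The window size, threshold and metric are illustrative choices, not a prescription from the article.

    from collections import deque

    class DriftMonitor:
        """Flags a deployed model for retraining when rolling accuracy drops too far."""

        def __init__(self, baseline_accuracy: float, window: int = 500, max_drop: float = 0.05):
            self.baseline = baseline_accuracy     # accuracy measured at deployment time
            self.max_drop = max_drop              # tolerated absolute drop
            self.outcomes = deque(maxlen=window)  # 1 = correct prediction, 0 = incorrect

        def record(self, prediction, actual) -> None:
            self.outcomes.append(1 if prediction == actual else 0)

        def needs_retraining(self) -> bool:
            if len(self.outcomes) < self.outcomes.maxlen:
                return False  # not enough recent evidence yet
            rolling_accuracy = sum(self.outcomes) / len(self.outcomes)
            return (self.baseline - rolling_accuracy) > self.max_drop

    # Usage: feed each scored-and-later-labeled example into the monitor.
    monitor = DriftMonitor(baseline_accuracy=0.92)
    # monitor.record(model_prediction, ground_truth)
    # if monitor.needs_retraining(): trigger the training pipeline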

    Function as a service

    Function as a service (FaaS) is an implementation of serverless computing that works well for many application deployment scenarios, serverless machine learning included. The idea is to create a pipeline by which code is moved, in series, from testing to versioning to deployment, using FaaS throughout as the processing resource. When the pipeline is well-conceived and implemented, most of the housekeeping difficulties of development and deployment are minimized, if not removed.

    A machine learning model deployment adds two steps to this pipeline:

    • training, upon which the model's quality depends; and
    • publishing: timing the go-live of the code in production, once it's deployed.

    FaaS is a great platform for this kind of process, given its versatile and flexible nature.

    All the major public clouds provide FaaS. The list begins with AWS Lambda, Microsoft Azure Functions, Google Cloud Functions and IBM Cloud Functions, and it includes many others.
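
    To give a feel for how small the deployment unit can be, here is a minimal sketch of a serverless inference function written in the style of an AWS Lambda handler behind an HTTP trigger. The model path, the event shape and the scikit-learn-like predict() interface are assumptions for illustration; a real deployment would also package the model and its library dependencies with the function.

    import json
    import pickle

    _model = None  # cached across warm invocations of the same function instance

    def _load_model():
        """Load a previously trained model; the path is a hypothetical placeholder."""
        global _model
        if _model is None:
            with open("/opt/models/churn_model.pkl", "rb") as f:
                _model = pickle.load(f)
        return _model

    def handler(event, context):
        """Entry point invoked by the FaaS platform for each request."""
        features = json.loads(event["body"])["features"]
        prediction = _load_model().predict([features])[0]
        return {
            "statusCode": 200,
            "body": json.dumps({"prediction": int(prediction)}),
        }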

    Easier AI development

    The major FaaS platforms accommodate JavaScript, Python and a broad range of other languages. For example, Azure Functions is Python- and JavaScript-friendly.

    Beyond the languages themselves, there are many machine learning libraries available through serverless machine learning offerings: TensorFlow, PyTorch, Keras, MLpack, Spark ML, Apache MXNet and a great many more.

    A key point about AI development in the FaaS domain is that it vastly simplifies the developer's investment in architecture: autoscaling is built in; multithreading goes away, as mentioned above; fault tolerance and high availability are provided by default.

    Moreover, if machine learning models are essentially functions handled by FaaS, then they are abstracted and autonomous in a way that relieves timeline pressure when different teams are working with different microservices in an application system. The lives of product managers get much easier.

    Turnkey FaaS machine learning

    Vendors are doubling down on the concept of serverless computing and continuing to refine their options. Amazon, Google and others have services set aside to do your model training for you, on demand: Amazon SageMaker and the Google Cloud ML Engine are two of many such services, which also include IBM Watson Machine Learning, Salesforce Einstein and Seldon Core, which is open source.

    These services do more than just train machine learning models. Many serverless machine learning offerings handle the construction of data sets to be used in model training, provide libraries of machine learning algorithms, and configure and optimize the machine learning libraries mentioned above.

    Some offer model tuning, automated adjustments of algorithm parameters to tweak the model to its highest predictive capacity.

    The FaaS you choose, then, could lead to one-stop shopping for your machine learning application.

    Author: Scott Robinson

    Source: TechTarget

  • IBM earns more on lower revenue


    Currency headwinds, the divestment of business units and lower margins in the growth segment cloud pushed IBM's revenue down on almost every front. But IBM did retain more from its business activities.

    IBM reported fourth-quarter revenue of 24.1 billion dollars, almost 12 percent less than in the fourth quarter of 2013. Revenue also fell over the full year: at 92.8 billion dollars, 2014 revenue came in 5.7 percent lower than in 2013.

    The revenue decline looks more dramatic than it actually is. In 2013, several IBM divisions still booked revenue from the System x division, which was sold to Lenovo last year. There are also differences caused by the outsourcing of customer service and by the rise of the dollar. Corrected for those factors, the fourth-quarter decline was limited to 2 percent. IBM does not specify their effect on full-year revenue.

    Cloud takes its toll
    That IBM is also contending with structural changes in the market is evident from the fortunes of its software division. For years, a revenue decline there was unthinkable, but in the fourth quarter software revenue fell by almost 7 percent. Corrected for currency fluctuations, a decline of 3 percent remains. Over the full year, software revenue fell by 2 percent. That negative development reflects the rise of the cloud. IBM brought in 7 billion dollars with cloud services in 2014, 60 percent more than in 2013, but by its own account the scale of its cloud services is not yet sufficient to compensate for the lower margins.

    Higher gross margin, lower net profit
    Nevertheless, IBM managed to nudge its operating result as a percentage of revenue slightly higher. The gross profit margin over 2014 was 50 percent, against 49.5 percent over 2013. In the fourth quarter, the gap with 2013 was slightly larger: 53.3 versus 52.4 percent.

    That increased efficiency did not translate into higher net profit. After taxes, depreciation and one-off charges, IBM booked a fourth-quarter net profit of 5.5 billion dollars, 11 percent less than in the fourth quarter of 2013. Over the full year, net profit even fell by 27 percent to 12 billion dollars. Besides currency fluctuations, several one-off cost items play a role here, such as the cost of downsizing the microelectronics division and a provision of 580 million dollars for workforce reductions.

     

    Automatiseringsgids, 21 January 2015

  • Insights from Dresner Advisory Services’ 2016 The Internet of Things and Business Intelligence Market Study

    • Sales and strategic planning teams see IoT as the most valuable.
    • IoT advocates are 3X as likely to consider big data critical to the success of their initiatives & programs.
    • Amazon and Cloudera are the highest-ranked big data distributions, followed by Hortonworks and MapR.
    • Apache Spark MLlib is the best-known technology on the nascent machine learning landscape today.

    These and many other excellent insights are from Dresner Advisory Services’ 2016 The Internet of Things and Business Intelligence Market Study published last month. What makes this study noteworthy is the depth of analysis and insights the Dresner analyst team delivers regarding the intersection of big data and the Internet of Things (IoT), big data adoption, analytics, and big data distributions. The report also provides an analysis of Cloud Business Intelligence (BI) feature requirements, architecture, and security insights. IoT adoption is thoroughly covered in the study, with a key finding being that large organizations or enterprises are the strongest catalyst of IoT adoption and use. Mature BI programs are also strong advocates or adopters of IoT and as a result experience greater BI success. IoT advocates are defined as those respondents that rated IoT as either critical or very important to their initiatives and strategies.

    Key takeaways of the study include the following:

    • Sales and strategic planning see IoT as the most valuable today. The combined rankings of IoT as critical and very important are highest for sales, strategic planning and the Business Intelligence (BI) Competency Centers. Sales ranking IoT so highly is indicative of how a wide spectrum of companies, from start-ups to large-scale enterprises, is attempting to launch business models and derive revenue from IoT. Strategic planning’s prioritization of IoT is also driven by a long-term focus on how to capitalize on the technology’s inherent strengths in providing greater contextual intelligence, insight, and potential data-as-a-service business models.

    [Figure: IoT importance by function]

    • Biotechnology, consulting, and advertising are the industries that believe IoT is the most important to their industries. Adoption of IoT across a wide variety of industries is happening today, with significant results being delivered in manufacturing, distribution including asset management, logistics, supply chain management, and marketing. The study found that the majority of industries see IoT as not important today, with the exception of biotechnology.

    [Figure: IoT importance by industry]

    • Location intelligence, mobile device support, in-memory analysis, and integration with operational systems are the four areas that most differentiate IoT advocates’ interests and focus. Compared to the overall sample of respondents, IoT advocates have significantly more in-depth areas of focus than the broader respondent base. The four areas of location intelligence, mobile device support, in-memory analysis, and integration with operational systems show they have a practical, pragmatic mindset regarding how IoT can contribute to greater process efficiency and revenue and integrate effectively with existing systems.

    [Figure: Key focus areas of IoT advocates]

    • An organization’s ability to manage big data analytics is critically important to their success or failure with IoT. IoT advocates are 3X as likely to consider big data critical, and 2X as likely to consider big data very important. The study also found that IoT advocates see IoT as a core justification for investing in and implementing big data analytics and architectures.

    [Figure: Importance of big data]

    • Data warehouse optimization, customer/social analysis, and IoT are the top three big data use cases organizations are pursuing today according to the study. Data warehouse optimization is considered critical or very important to 50% of respondents, making this use case the most dominant in the study. Large-scale organizations are adopting big data to better aggregate, analyze and take action on the massive amount of data they generate daily to drive better decisions. One of the foundational findings of the study is that large-scale enterprises are driving the adoption of IoT, which is consistent with the use case analysis provided in the graphic below.

    [Figure: Big data use cases]

    • IoT advocates are significantly above average in their use of advanced and predictive analytics today. The group of IoT advocates identified in the survey is 50% more likely to be current users of advanced and predictive analytics apps as well. The study also found that advanced analytics users tend to be the most sophisticated and confident BI audience in an organization and see IoT data as ideal for interpretation using advanced analytics apps and techniques.

    [Figure: Use of advanced and predictive analytics]

    • Business intelligence experts, business analysts and statisticians/data scientists are the greatest early adopters of advanced and predictive analytics. More than 60% of each of these three groups of professionals is using analytics often, which could be interpreted as more than 50% of their working time.

    [Figure: Users of advanced and predictive analytics]

    • Relational database support, open client connectors (ODBC, JDBC) and automatic upgrades are the three most important architectural features for cloud BI apps today. Connectors and integration options for on-premises applications and data (ERP, CRM, and SCM) are considered more important than cloud application and database connection options. Multitenancy is considered unimportant to the majority of respondents. One factor contributing to the unimportance of multi-tenancy is the assumption that this is managed as part of the enterprise cloud platform.

    [Figure: Cloud BI architectural requirements]

    • MapReduce and Spark are the two most known and important big data infrastructure technologies according to respondents today. 48% believe that MapReduce is important and 42% believe Spark is. The study also found that all other categories of big data infrastructure are considered less important as the graphic below illustrates.

    [Figure: Big data infrastructure technologies]

    Forbes, 4 October 2016

  • Keeping the data of your organization safe by storing it in the cloud

    Keeping the data of your organization safe by storing it in the cloud

    We now live within the digital domain, and accessing vital information is more important than ever. Up until rather recently, most businesses tended to employ on-site data storage methods such as network servers, SSD hard drives, and direct-attached storage (DAS). However, cloud storage systems have now become commonplace.

    Perhaps the most well-known benefit of cloud storage solutions is that their virtual architecture ensures that all information remains accessible in the event of an on-site system failure. However, we tend to overlook the security advantages of cloud storage compared with traditional strategies. Let us examine some key takeaways.

    Technical Experts at Your Disposal

    A recent survey found that 73% of all organizations felt that they were unprepared in the event of a cyberattack. As this article points out, a staggering 40% suspected that their systems had been breached. It is therefore clear that legacy in-house approaches are failing to provide adequate security solutions.

    One of the main advantages of cloud-based data storage is that these services can provide targeted and customized data security solutions. Furthermore, a team of professionals is always standing by if a fault is suspected. This enables the storage platform to quickly diagnose and rectify the problem before massive amounts of data are lost or otherwise compromised. 

    Restricted Digital Access

    We also need to remember that one of the most profound threats to in-house data storage involves its physical nature. In other words, it is sometimes possible for unauthorized users (employees or even third parties) to gain access to sensitive information. Not only may this result in data theft, but the devices themselves could be purposely sabotaged, resulting in a massive data loss.

    The same cannot be said of cloud storage solutions. The information itself could very well be stored on a server located thousands of miles away from the business in question. This makes an intentional breach much less likely. Other security measures such as biometric access devices, gated entry systems, and CCTV cameras will also help deter any would-be thieves. 

    Fewer (if Any) Vulnerabilities

    The number of cloud-managed services is on the rise, and for good reason. These platforms allow businesses to optimize many factors such as CRM, sales, marketing campaigns, and e-commerce concerns. In the same respect, these bundles offer a much more refined approach to security. 

    This often comes with the ability to thwart what would otherwise remain in-house vulnerabilities. Some ways in which cloud servers can offer more robust storage solutions include:

    • 256-bit AES encryption
    • Highly advanced firewalls
    • Automatic threat detection systems
    • Multi-factor authentication

    In-house services may not be equipped with such protocols. As a result, they can be more vulnerable to threats such as phishing, compromised passwords, and distributed denial-of-service (DDoS) attacks. 

    The Notion of Data Redundancy

    The “Achilles’ heel” of on-site data storage has always stemmed from its physical nature. This is even more relevant when referring to unexpected natural disasters. Should a business endure a catastrophic situation, sensitive data could very well be lost permanently. This is once again when cloud storage solutions come into play.

    The virtual nature of these systems ensures that businesses can enjoy a much greater degree of redundancy. As opposed to having an IT team struggle for days or even weeks at a time to recover lost information, cloud servers provide instantaneous access to avoid potentially crippling periods of downtime. 

    Doing Away with Legacy Technology

    Another flaw that is often associated with in-house data storage solutions involves the use of legacy technology. Because the digital landscape is evolving at a frenetic pace, the chances are high that many of these systems are no longer relevant. What could have worked well yesterday may very well be obsolete tomorrow. Cloud solutions do not suffer from this drawback. Their architecture is updated regularly to guarantee that customers are always provided with the latest security protocols. Thus, their vital information will always remain behind closed (digital) doors.

    Brand Reputation

    A final and lesser-known benefit of cloud-based security is that clients are becoming more technically adept than in the past. They are aware of issues such as the growth of big data and GDPR compliance concerns. The reputation of businesses that continue to use outdated storage methods could therefore suffer as a result. Customers who are confident that their data is safe are much more likely to remain loyal over time. 

    Cloud Storage: Smart Solutions for Modern Times

    We can now see that there are several security advantages that cloud storage solutions have to offer. Although on-site methods may have been sufficient in the past, this is certainly no longer the case. Thankfully, there are many cloud providers associated with astounding levels of security. Any business that hopes to remain safe should therefore make this transition sooner rather than later. 

    Author: George Tuohy

    Source: Dataversity

  • Keeping your data safe in an era of cloud computing

    Keeping your data safe in an era of cloud computing

    These cloud security practices for 2020 are absolutely essential to keep your data safe and secure in this new decade. 

    In recent years, cloud computing has gained increasing popularity and proved its effectiveness. There is no doubt that cloud services are changing the business environment. Small companies value the ability to store documents in the cloud and conveniently manage them. Large business players appreciate the opportunity to save money on the acquisition and maintenance of their own data storage infrastructure. The movement towards cloud technologies is perceived as an undoubtedly positive trend that facilitates all aspects of human interaction with information systems.

    Despite the obvious benefits of cloud technologies, there is a set of problematic issues that pose a significant threat to cloud users, such as:

    • The degree of trust in the cloud service provider;
    • Ensuring confidentiality, integrity, relevance, and incontrovertibility of information at all levels;
    • Loss of data control and data leaks;
    • Protection against unauthorized access;
    • Malicious insiders;
    • Safeguarding personal data of users transmitted and processed in the cloud.

    Although cloud computing is no longer a new technology, ensuring data security remains a pressing concern for users worldwide. Security concerns remain the main obstacle to the widespread adoption of cloud technologies. What are the main threats to cloud security today? How will they affect the industry? What measures are essential to keep your sensitive data confidential? Read on to figure it out!

    Risks associated with cloud computing

    As you can guess, cloud computing servers have become a very attractive target for hackers. A virtual threat is associated with the possibility of remote penetration due to vulnerable infrastructure. Cybercriminal groups often steal users’ data for the purposes of blackmail and committing various frauds. As a rule, cybercriminals focus on small business networks because they are easier to breach. At the same time, data leaks still occur at large corporations; fraudsters often go after larger companies because of the allure of larger payouts.

    In November 2018, Marriott International announced that cyber thieves had stolen data on 500 million customers. The attackers targeted contact information, passport numbers, Starwood Preferred Guest numbers, travel information, and the credit card numbers and expiration dates of more than 100 million customers. Moreover, police officials noted that the 'human factor' was directly related to the problem: employees did not follow all security rules, which made the system vulnerable to hacker attacks.

    Security threats

    When a cloud service vendor supplies your business and stores your corporate data, you place your business in the partner’s hands. According to Risk Based Security research published in the 2019 MidYear QuickView Data Breach Report, during the first six months of 2019, there were more than 3,800 publicly disclosed breaches exposing 4.1 billion compromised records.

    If you entrust your data to a cloud provider, you should be confident about the reliability of the cloud server. Thus, it is essential to be aware of the existing risks to prevent disclosure of your sensitive information.

    The cloud computing system can be exposed to several types of security threats, which can be divided into the following groups:

    • Threats to the integrity;
    • Threats to confidentiality;
    • Accessibility risks;
    • Authorization risks;
    • Browser vulnerabilities.

    Data security

    Nobody wants their personal information to be disclosed to a broad audience. However, according to Forbes research, leaks from unsecured Facebook databases affected more than 419 million users. The principles of virtual technology pose potential threats to the information security of cloud computing associated with the use of shared data warehouses. When data is transmitted from one VM to another, there is a risk of disclosure to a third party.

    Threats related to the functioning of virtual machines

    Virtual machines are dynamic: they are cloned and can move between physical servers. This variability complicates maintaining the integrity of the security system. Vulnerabilities in the OS or applications in a virtual environment can spread unchecked and often manifest only after an arbitrary period of time (for example, when restoring from a backup). In a cloud computing environment, it is important to securely record the security status of the system, regardless of its location.

    Vulnerability of virtual environment

    Another major risk you may face is vulnerability within the virtual environment. Cloud computing servers and on-premises servers use the same OS and applications. For cloud systems, the risk of remote hacking or malware infection is high. Intrusion detection and prevention systems are installed to detect malicious activity at the virtual machine level, regardless of its location in the cloud.

    Blurring of network perimeter

    When you move to the cloud, the network perimeter is blurred or disappears. As a result, the protection of the least secure part of the network determines the overall level of security. To distinguish between segments with different levels of trust in the cloud, virtual machines must be protected by moving the network perimeter to the virtual machine itself. A corporate firewall remains the main component for implementing IT security policies and delimiting network segments, and it will protect your business from undesired disclosure.

    Attacks on hypervisor

    The hypervisor is one of the key elements of a virtual system. Its main function is to share resources across virtual machines. An attack on a hypervisor can allow one virtual machine (usually one controlled by the attacker) to gain access to the memory and resources of another. To secure your data, it is recommended to use specialized products for virtual environments, integrate host servers with the Active Directory service, enforce password complexity and expiration policies, standardize procedures for accessing host server management tools, and use the built-in virtualization host firewall. It is also possible to disable rarely used services such as web access to the virtualization server.

    Solutions to decrease cloud computing risks

    Encryption

    As you already know, most of the problems related to cloud technologies can be solved with the help of cryptographic information protection. Encryption is one of the most effective ways to protect data. The provider must encrypt the client’s information stored in the data center and permanently delete it once it is removed from the server. Encryption makes users’ data useless to anyone who does not have the keys to decrypt it. The owner of the encryption keys maintains data security and decides to whom, and to what degree, access should be provided.

    Encrypted data is available only after authentication. It cannot be read or changed, even when accessed through untrusted nodes. Such technologies are well known, and reliable algorithms and protocols such as AES, TLS, and IPsec have long been used by providers.
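
    As a minimal sketch of what symmetric encryption of cloud-bound data can look like in practice, the example below encrypts a payload with AES-256 in GCM mode before upload and decrypts it after download. It assumes the third-party cryptography package; key management (who holds and rotates the key) is the part that actually determines who controls the data and is deliberately left out here.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

    def encrypt_for_upload(key: bytes, plaintext: bytes) -> bytes:
        """Encrypt with AES-256-GCM; prepend the nonce so it travels with the ciphertext."""
        nonce = os.urandom(12)  # must be unique per message
        ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
        return nonce + ciphertext

    def decrypt_after_download(key: bytes, blob: bytes) -> bytes:
        """Split off the nonce and decrypt; raises an exception if the data was tampered with."""
        nonce, ciphertext = blob[:12], blob[12:]
        return AESGCM(key).decrypt(nonce, ciphertext, None)

    key = AESGCM.generate_key(bit_length=256)  # keep this key outside the cloud provider's reach
    blob = encrypt_for_upload(key, b"customer record to store in the cloud")
    assert decrypt_after_download(key, blob) == b"customer record to store in the cloud"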

    Authentication

    Authentication is another approach to ensuring data security. In simple terms, it can be understood as reliable password protection, while certificates and tokens can be used to gain a higher level of assurance. For instance, protocols such as LDAP (Lightweight Directory Access Protocol) and SAML (Security Assertion Markup Language) can help ensure your sensitive data is stored securely on the cloud server.
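
    The sketch below illustrates the token idea in its simplest form: an HMAC-signed token that a service can verify without storing session state. This is a toy example built on Python's standard library to show the principle only; production systems would rely on an established standard such as SAML assertions or signed tokens issued by an identity provider, and the shared secret here is a hypothetical placeholder.

    import base64, hashlib, hmac, json, time

    SECRET = b"shared-signing-secret"  # hypothetical; would come from a secrets manager

    def issue_token(user: str, ttl_seconds: int = 3600) -> str:
        """Create a signed token carrying the user name and an expiry timestamp."""
        payload = json.dumps({"user": user, "exp": int(time.time()) + ttl_seconds}).encode()
        signature = hmac.new(SECRET, payload, hashlib.sha256).digest()
        return ".".join(base64.urlsafe_b64encode(part).decode() for part in (payload, signature))

    def verify_token(token: str):
        """Return the claims if the signature is valid and the token has not expired, else None."""
        payload_b64, signature_b64 = token.split(".")
        payload = base64.urlsafe_b64decode(payload_b64)
        signature = base64.urlsafe_b64decode(signature_b64)
        expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
        if not hmac.compare_digest(signature, expected):
            return None
        claims = json.loads(payload)
        return claims if claims["exp"] > time.time() else None

    token = issue_token("alice")
    print(verify_token(token))  # {'user': 'alice', 'exp': ...}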

    Conclusion

    In these times, data security is more important than ever. Be sure to enact key cloud security measures as we head into 2020.

    Author: Ryan Kh

    Source: Smart Data Collective

  • Making your Organization more intelligent with a Cloud Data Strategy

    Making your Organization more intelligent with a Cloud Data Strategy

    At a time when most major companies are showing a long-range commitment to “data-driven culture,” data is considered the most prized asset. An Enterprise Data Strategy, along with aligned technology and business goals, can significantly contribute to the core performance metrics of a business. The underlying principles of an Enterprise Data Strategy comprise a multi-step framework, a well-designed strategy process, and a definitive plan of action. However, in reality, very few businesses today have their Data Strategy aligned with overall business and technology goals.

    Data Management Mistakes Are Costly

    Unless the overall business and technology goals of a business are aligned with a Data Strategy, the business may suffer expensive Data Management failure incidents from time to time. If the Data Strategy is implemented in line with a well-laid out action plan that seeks to transform the current state of affairs into “strategic Data Management initiatives” leading to the fulfillment of desirable business needs and objectives in the long term, then there is a higher chance of that Data Strategy achieving the desired outcomes. 

    Data provides “insights” that businesses use for competitive advantage. When overall business goals and technology goals are left out of the loop of an Enterprise Data Strategy, the data activities are likely to deliver wrong results, and cause huge losses to the business.

    What Can Businesses Do to Remain Data-Driven?

    Businesses that have adopted a data-driven culture and those expecting to do so, can invest some initial time and effort to explore the underlying relationships between the overall business goals, technology goals, and Data Strategy goals. The best part is they can use their existing advanced analytics infrastructure to make this assessment before drafting a policy document for developing the Data Strategy.

    This initial investment in time and effort will go a long way toward ensuring that the business’s core functions (technology, business, and Data Science) are aligned and have the same objectives. Without this effort, the Data Strategy can easily become fragmented and resource-heavy—and ineffective.

    According to Anthony Algmin, Principal at Algmin Data Leadership, “Thinking of a Data Strategy as something independent of Business Strategy is a recipe for disaster.”

    Data Governance has recently become a central concern for data-centric organizations, and all future Data Strategies will include Data Governance as a core component. The future Data Strategy initiatives will have to take regulatory compliances seriously to ensure long-term success of such strategies. The hope is that this year, businesses will employ advanced technologies like big data, graph, and machine learning (ML) to design and implement a strong Data Strategy.

    In today’s digital ecosystem, the Data Strategy means the difference between survival and extinction of a business. Any business that is thinking of using data as a strategic asset for predetermined business outcomes must invest in planning and developing a Data Strategy. The Data Strategy will not only aid the business in achieving the desired objectives, but will also keep the overall Data Management activities on track.

    A Parallel Trend: Rapid Cloud Adoption

    As Data Strategy and Data Governance continue to gain momentum among global businesses, another parallel trend that has surfaced is the rapid shift to cloud infrastructures for business processing.

    As with on-premise Data Management practices, Cloud Data Management practices also revolve around MDM, Metadata Management, and Data Quality. As organizations continue their journey to the cloud, they will need to ensure their Data Management practices conform to all Data Quality and Data Governance standards.

    A nagging concern among business owners and operators who have either shifted to the cloud or are planning a shift is data security and privacy. In fact, many medium or smaller operations have resisted the cloud as they are unsure or uninformed about the data protection technologies available on the cloud. Current businesses owners expect cloud service providers to offer premium data protection services.

    The issues around Cloud Data Management are many: the ability of cloud resources to handle high-volume data, the security leaks in data transmission pipelines, data storage and replication policies of individual service providers, and the possibilities of data loss from cloud hosts. Cloud customers want uninterrupted data availability, low latency, and instant recovery—all the privileges they have enjoyed so far in an on-premise data center.

    One technology solution often discussed in the context of cloud data protection is JetStream. In a live webinar, Arun Murthy, co-founder and Chief Product Officer of Hortonworks, demonstrated how the cloud needs to be part of the overall Data Strategy to fulfill business needs like data security, Data Governance, and a holistic user experience. The webinar proceedings are discussed in Cloud Computing—an Extension of Your Data Strategy.

    Cloud Now Viewed as Integral Part of Enterprise Data Strategy

    One of the most talked-about claims made by industry experts at the beginning of 2017 was that it “would be a tipping point for the cloud.” These experts and cloud researchers also suggested that the cloud would bring transformational value to business models through 2022 and would become an inevitable component of them. According to market watcher Forrester, “cloud is no longer about cheap servers or storage, (but) the best platform to turn innovative ideas into great software quickly.”

    As cloud enables big data analytics at scale, it is a popular computing platform for larger businesses who want the benefits without having to make huge in-house investments. Cloud holds promises for medium and small businesses, too, with tailor-made solutions for custom computing needs at affordable cost.

    The following points should be kept in mind while developing a strategy plan for the cloud transformation:

    • Consensus Building for Cloud Data Strategy: The core requirement behind building a successful Data Strategy for the cloud is consensus building between the central IT team, the cloud architect, and the C-suite executives. Reaching that consensus becomes harder where businesses mix and match their cloud implementations.
    • Data Architectures on Native Cloud: The news feature titled Six Key Data Strategy Considerations for Your Cloud-Native Transformation throws light on cloud-native infrastructure, which is often ignored during a business transformation. According to this article, though enterprises are busy making investments in a cloud-native environment, they rarely take the time to plan the transformation, thus leaving Data Architecture issues like data access and data movement unattended. 
    • Creating Data Replicas: Data replication on the cloud must avoid legacy approaches, which typically enabled data updating after long durations.
    • Data Stores across Multiple Clouds: HIT Think: How to Assess Weak Links in a Cloud Data Strategy specifically refers to storage of healthcare data, where data protection and quick data recovery are achieved through the provisioning of multiple cloud vendors. These solutions are not only cost-friendly, but also efficient and secure. 

    Author: Paramita (Guha) Ghosh

    Source: Dataversity

  • Moving your projects to the cloud, but why?

    Moving your projects to the cloud, but why?

    Understanding the cloud main advantages and disadvantages

    In this article, we are going to change the context slightly. In the last articles, we talked about data management, the importance of data quality, and business analytics. This time, I am excited to announce that over the next few weeks we are going to explore a trend that will affect all companies in this decade: the cloud. I know the cloud is a very broad topic with a lot of concepts, so we’ll focus on data in the cloud.

    I think by now we have all heard about the cloud and its capabilities, but do you know all the benefits and implications it has? In this first post, I would like to explore the basic concepts of the cloud with you, and in the next few weeks accompany you on a trip through how we can find relevant insights using cloud resources.

    First of all, I want you to understand why this post is for you. So, if you are…

    an individual, whether you’re in business or tech, you need to understand these concepts and how the cloud is changing the game.

    a company, you must have a cloud strategy. We are not talking about having your workload 100% migrated to the cloud tomorrow, but you should have a roadmap for the next few years.

    What is cloud computing?

    At this point, I would like to define what cloud computing is. Since 2017, countless statements have circulated on social networks saying:

    ''Cloud computing is just someone’s else computer''

    This false idea has spread over the Internet. I must admit that I had a sticker on my laptop with that slogan a few years ago. But the truth is, if you say that, you do not fully understand what cloud computing is. While it is true that, reduced to a minimum, cloud computing is about renting compute power from others for your own purposes, a whole world of possibilities has been built on this idea, with implications at all organizational levels of a company.

    Let’s talk about the advantages

    The economy of scale

    As you surely know, today everything is done on a large scale, especially when we talk about the world of data. For this reason, we must be able to operate less expensively and more efficiently when we do things on a large scale. The cloud takes advantage of the economy of scale, allowing our businesses to grow and be more profitable as they grow.

    Pay-as-you-go

    Another of the many advantages of cloud computing affects the financial level because it changes the spending model. You should understand these spending models well, to know why it is an advantage of the cloud.

    • Capital expenditure (CapEx): an investment in a fixed asset whose cost is then deducted from your tax bill over time. Examples of assets in this category are buildings, equipment or, more specifically, the servers or data center you buy (on-premise).
    • Operational expenditure (OpEx): the expenses necessary for the operation of the business, which you can deduct from your tax bill in the same year. There are no upfront costs; you pay for what you use.

    Operational expenses enable a pay-as-you-go pricing model, which allows your company to reduce costs and gain flexibility.
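
    As a rough, purely illustrative comparison of the two spending models, the sketch below contrasts buying a server up front (CapEx) with paying for comparable cloud capacity by the hour (OpEx) at different utilization levels. All prices and utilization figures are hypothetical placeholders; the point is only that pay-as-you-go cost scales with actual usage.

    # Hypothetical figures for a three-year horizon.
    SERVER_PURCHASE = 9_000        # CapEx: server bought up front (euros)
    ANNUAL_MAINTENANCE = 1_200     # power, hosting and support per year (euros)
    CLOUD_RATE_PER_HOUR = 0.40     # OpEx: comparable cloud instance (euros/hour)
    YEARS = 3
    HOURS_PER_YEAR = 24 * 365

    def capex_total(years: int) -> float:
        """Fixed investment plus yearly upkeep, paid regardless of actual usage."""
        return SERVER_PURCHASE + ANNUAL_MAINTENANCE * years

    def opex_total(years: int, utilization: float) -> float:
        """Pay-as-you-go: you are only billed for the hours you actually run."""
        return CLOUD_RATE_PER_HOUR * HOURS_PER_YEAR * years * utilization

    print(f"CapEx over {YEARS} years: {capex_total(YEARS):,.0f}")
    for utilization in (0.15, 0.50, 1.00):
        print(f"OpEx at {utilization:.0%} utilization: {opex_total(YEARS, utilization):,.0f}")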

    Reduced time-to-market

    Thanks to the cloud, the time-to-market for new products or the growth of the existing ones is reduced.

    If your company, regardless of its size, wants to try a new product, with the cloud you will be able to do so much more agilely, since it allows you to allocate resources in a much faster and more precise way.

    On the other hand, if you already have a product running and want to make it grow to other countries, the cloud will allow you to do it much more efficiently.

    Scalability, elasticity and reliability

    Another advantage of the cloud is closely related to the pay-as-you-go model: scalability and elasticity, which allow your business to constantly adapt to demand. This has two aspects: on the one hand, it prevents you from paying for idle infrastructure; on the other, it allows your business to grow as demand grows while guaranteeing the quality of the service.

    Also, the cloud allows you to increase the reliability of your technology through disaster recovery policies, data replication, or backups.

    Focus on the business

    With the shared responsibility models of cloud providers, you can free yourself from certain responsibilities and put a greater focus on growing your business. There are different cloud models, which we will see below, but I anticipate that depending on the model you choose, the distribution of responsibility will vary.

    It’s not about being carefree. Using technology always carries great responsibility, and many aspects must always be borne in mind, especially when we talk about data. However, in any cloud model there will always be a part delegated to the provider, which will free you to a greater or lesser extent from recurring and costly tasks for your business.

    Security

    I believe that security should always have a separate section. Closely related to economies of scale, security solutions are cheaper when deployed on a large scale, and the cloud takes advantage of this. Security is a key element today, being a differentiator for many clients like you. This demand makes cloud providers put special focus on security.

    Finally, and related to the shared responsibility model, depending on the solutions implemented, the cloud provider usually acquires certain maintenance responsibilities such as the installation of updates, application of security patches, or security implementations at the infrastructure level so you don’t have to worry about these tasks.

    But why does nobody talk about the risks?

    There are always two sides. We have talked about the advantages and I am sure that many, if not all, you would already know. I hope that at this point, you have gathered that the cloud offers you great opportunities whether you are a small, medium, or large company.

    Now, why do so few people talk about risks? There is no perfect solution, so I think it is just as important to know the benefits as it is to talk about the risks. When you have all the information on the table, you can make a decision with a much more objective criterion than just seeing a part.

    Provider dependency

    When you use any technology, a link is established between your business and that technology. A dependency is created that can be higher or lower depending on the technology and the function it has in your business. This dependency gives cloud providers greater bargaining power, as switching costs arise that were not so present before. For example, if we use accounting software or a CRM in the cloud, the switching costs are very high, because they perform very core functions in your business.

    The same happens with the infrastructure, if all your technology infrastructure relies on a specific cloud provider, it gives that provider greater control. For example, if the cloud provider decides to change prices and you have all your infrastructure hosted with that provider, you have two options: either you accept the changes or you incur the cost of infrastructure migration.

    Not all services are available everywhere

    Not all cloud providers offer the same services, and even the same provider's services are not available worldwide. A service you need may be offered by a certain provider in the geographic area that interests you today; if you later need to scale to other geographic regions, that service may not be available there and your ability to act will be limited.

    On the other hand, and related to the previous point, the fact that you use a specific service with a certain provider does not imply that, should the time come when you need to change providers, you will be able to do so, since not all providers have the same catalog of services.

    As you have seen, the cloud has great potential for your business, since it allows you to gain agility, reduce time to market and optimise costs, which would be much more difficult with on-premise solutions. However, in addition to the advantages, you must always keep the main disadvantages in mind, since dependencies and switching costs that were not so present before may well appear.

     
  • Nederlandse bedrijven maken steeds meer gebruik van software oplossingen uit de cloud

    Nederlandse bedrijven maken steeds meer gebruik van software oplossingen uit de cloud

    Van de Nederlandse organisaties gebruikt 68% een of meerdere software-oplossingen uit de cloud. In 2016 had nog 52% van de organisaties een of meerdere cloudoplossingen in huis. Dit blijkt uit een analyse van Smart Profile van de cloudmarkt waarvoor het 7800 organisaties met meer dan 50 medewerkers ondervroeg. Bij multinationals en de overheid zijn cloud-toepassingen het meest populair. Het onderzoek laat verder zien dat Microsoft met oplossingen die door 39,7% van de respondenten gebruikt worden, de grootste leverancier is van cloud software.

    Van alle SaaS (Software as a Service) oplossingen die gebruikt worden betreft het grootste deel toepassingen voor kantoorautomatisering (24,7%). HR-oplossingen volgen daarna met 18,8%. Verticale toepassingen gericht op bijvoorbeeld het onderwijs of de zorg maken 11,5% uit van alle gebruikte SaaS-oplossingen. Microsoft is met 39,7% onder de respondenten de grootste leverancier van software uit de cloud. Het gaat hierbij vooral om Office365, SharePoint en Exchange. De plekken twee tot en met vijf worden ingenomen door AFAS (10,2%), RAET (8,1%), Salesforce (5,9%) en Schoolmaster (5,1%).

    Salesforce nadert Oracle

    De analyse van Smart Profile van de eindgebruikersmarkt laat verder zien dat Oracle Eloqua met 31% de meest gebruikte marketing automation oplossing uit de cloud is. Salesforce Pardot volgt echter kort erna: 29,8% van de respondenten die een marketing automation-oplossing uit de cloud gebruikt geeft aan voor Salesforce Pardot te hebben gekozen. Adobe’s oplossing Marketo (15,3%), Hubspot (13%) en Act-on (7,3%) maken de top 5 compleet. De groeiende populariteit van de cloud uit zich ook in de toenemende adoptie van cloud-capaciteit in het datacenter. Overall is het in een jaar bijna verdubbeld. In alle branches is een forse toename te zien van cloud-capaciteit die als IaaS (Infrastructure as a Service) of PaaS (Platform as a Service) wordt afgenomen. De grootste leverancier van public cloud in het datacenter is Microsoft. 79,5% van de organisaties die gebruikmaken van cloudcapaciteit geeft aan dat ze Azure gebruiken. Amazon volgt met 21,7%.

    Analyse markt cloud-leveranciers

    Naast een analyse van de eindgebruikersmarkt onderzocht Smart Profile ook de markt van cloudleveranciers. Uit deze analyse onder de 1587 belangrijkste partijen wordt onder meer duidelijk welke strategie leveranciers nu hanteren, waarschijnlijk over twee jaar zullen hanteren, en wat de trends zijn. Het volledige onderzoek vindt u hier.

    Bron: BI platform

     

  • On-premise or cloud-based? A guide to appropriate data governance

    On-premise or cloud-based? A guide to appropriate data governance

    Data governance involves developing strategies and practices to ensure high-quality data throughout its lifecycle.

    However, besides deciding how to manage data governance, you must choose whether to apply the respective principles in an on-premise setting or the cloud.

    Here are four pointers to help:

    1. Choose on-premise when third-party misconduct is a prevalent concern

    One of the goals of data governance is to determine the best ways to keep data safe. That's why data safety comes into the picture when people choose cloud-based or on-premise solutions. If your company holds sensitive data like health information and you're worried about a third-party not abiding by your data governance policies, an on-premise solution could be right for you.

    Third-party cloud providers must abide by regulations for storing health data, but they still make mistakes. Some companies offer tools that let you determine a cloud company's level of risk and see the safeguards it has in place to prevent data breaches. You may consider using one of those to assess whether third-party misconduct is a valid concern as you strive to maintain data governance best practices.

    One thing to keep in mind is that the shortcomings of third-party companies could cause long-term damage to your company's reputation. For example, in a case where a cloud provider has a misconfigured server that allows a data breach to happen, they're to blame. But the headlines about the incident will likely primarily feature your brand and may only mention the outside company in a passing sentence.

    If you opt for on-premise data governance, your company alone is in the spotlight if something goes wrong, but it's also possible to exert more control over all facets of data governance to promote consistency. When you need scalability, cloud-based technology typically allows you to ramp up faster, but you shouldn't gain that speed at the price of exposing yourself to a possible third-party blunder.

    2. Select cloud-based data governance if you lack data governance maturity

    Implementing a data governance program is a time-consuming but worthwhile process. A data governance maturity assessment model can be useful for seeing how your company's approach to data governance stacks up to industry-wide best practices. It can also identify gaps to illuminate what has to happen for ongoing progress to occur.

    Using a data governance maturity assessment model can also signal to stakeholders that data governance is a priority within your organization. However, if your assessments show the company has a long way to go before it can adhere to best practices, cloud-based data governance could be the right choice.

    That's because the leading cloud providers have their own in-house data governance strategies in place. They shouldn't replace the ones used in-house at your company, but they could help you fill in the known gaps while improving company-wide data governance.

    3. Go with on-premise if you want ownership

    One of the things that companies often don't like about using a cloud provider for data governance is that they don't have ownership of the software. Instead, they usually enter into a leasing agreement, similar to leasing an automobile. So, if you want complete control over the software used to manage your data, on-premise is the only option that gives you that ownership.

    One thing to keep in mind about on-premise data governance is that you are responsible for data security. As such, you must have protocols in place to keep your software updated against the latest security threats.

    Cloud providers usually update their software more frequently than you might in an on-premise scenario. That means you have to be especially proactive about dealing with known security flaws in outdated software. Indeed, on-premise data governance has the benefit of ownership, but your organization has to be ready to accept all the responsibility that option brings.

    4. Know that specialized data governance tools are advantageous in both cases

    You've already learned a few of the pros and cons of on-premise versus cloud-based solutions to meet your data governance requirements. Don't forget that no matter which of those you choose, specialty software can help you get a handle on data access, storage, usage and more. For example, software exists to help companies manage their data lakes whether they are on the premises or in the cloud.

    Those tools can sync with third-party sources of data to allow monitoring of all the data from a single interface. Moreover, they can track metadata changes, allowing users to become more aware of data categorization strategies.

    Regardless of whether you ultimately decide it's best to manage data governance through an on-premise solution or in the cloud, take the necessary time to investigate data governance tools. They could give your company insights that are particularly useful during compliance audits or as your company starts using data in new ways.

    Evaluate the tradeoffs

    As you figure out if it's better to entrust data governance to a cloud company or handle it on-site, don't forget that each option has pros and cons.

    Cloud companies offer convenience, but only if their data governance principles align with your needs. And, if customization is one of your top concerns, on-premise data governance gives you the most flexibility to make tweaks as your company evolves.

    Studying the advantages and disadvantages of these options carefully before making a decision should leave you as well informed as possible about how to accommodate your company's present and future needs.

    Author: Kayla Matthews

    Source: Information-management

  • Part of a Multi-Cloud Organization? Be Aware of these Data Security Implications  

    Part of a Multi-Cloud Organization? Be Aware of these Data Security Implications

    The world is going multi-cloud. Enterprises are leveraging the benefits of multi-cloud services to improve operational efficiency, reduce costs, and drive faster innovation. What does this mean for data privacy? With data residing in multiple locations, it’s more important than ever for organizations to understand their data privacy risks and ensure that any sensitive data is protected. 

    In the previous “mono-cloud” generation, adopting diverse cloud services across different departments (for example, Salesforce for Customer Success, Zendesk for Help Desk, Google Docs for collaboration) enabled businesses to optimize their resources and spend less on IT infrastructure maintenance. However, with so much data being centralized in one place, there were growing concerns about the privacy and security of data. 

    One serious data privacy issue arose from centralized data storage in the cloud. When data was centralized in the cloud, it was highly accessible but also highly vulnerable to security threats, data breaches, and privacy violations. One of the dangers of centralized data storage was the single point of failure. In the event of an outage, users were not able to access critical business data. Another danger was the increased likelihood of data breaches, since a single central store gave hackers one convenient target. Also, if the data was not encrypted, it posed a risk to the privacy of customers.

    To mitigate these issues, businesses started adopting a multi-cloud strategy. This enabled organizations to store data across multiple cloud service providers. This way, if one vendor went down, users could still access critical data from another vendor. In the typical multi-cloud organization, user data is spread across many cloud systems. 

    But here are the primary data privacy challenges of multi-cloud organizations:

    • Data location transparency: It can be difficult for you, the end user, to know exactly where your data is stored. Because many cloud computing providers offer what may appear to be similar services, organizations struggle to determine which provider hosts a given piece of data. This can make it challenging for businesses to comply with data privacy regulations, retain control over sensitive information, and monitor the security of their data (a minimal discovery sketch follows this list). 
    • Data breaches due to incorrect contracting practices: A second data privacy challenge in the multi-cloud organization is the problem of data breaches emanating from poor contracting practices. If businesses fail to adopt the right multi-cloud strategies, they may not be able to oversee their contracts properly. This can lead to data breaches when their cloud service providers fail to meet certain standards like data sovereignty laws, data protection laws, and so on. To avoid this, businesses can make sure that they are contracting with vendors that meet the legal requirements.
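
    As a minimal discovery sketch, assuming AWS S3 and the boto3 SDK, the snippet below inventories in which region each storage bucket actually lives; comparable inventory calls exist for other providers and are a practical first step toward data location transparency:

        import boto3

        s3 = boto3.client("s3")

        # List every bucket and the region it is stored in; None means us-east-1.
        for bucket in s3.list_buckets()["Buckets"]:
            location = s3.get_bucket_location(Bucket=bucket["Name"])["LocationConstraint"]
            print(bucket["Name"], location or "us-east-1")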

    In short, multi-cloud data management environments bring their own data privacy and security challenges.

    Key Security Challenges and Solutions for Multi-Cloud Organizations

    As multi-cloud adoption continues to rise among global organizations, Gartner has suggested that presently almost 70% of organizations have put a multi-cloud strategy in place. Consequently, one of the biggest concerns for companies operating in the multi-cloud era is data security. Data security is the protection of information, systems, and devices from theft or unauthorized access. In the multi-cloud era, businesses must adopt a strong data security strategy. Here are reasons for this:

    • Businesses are likely to store sensitive data across different cloud service providers. This makes it imperative for businesses to have a strategy to ensure that their data remains protected from breaches in the event of a disaster. 
    • Businesses are legally obligated to protect customer data in case of a data breach. As per GDPR, if customer data gets breached due to negligence on the part of a company, they are liable to pay a hefty fine. 

    The multi-cloud environment brings significant security challenges to organizations. The following are some key security challenges organizations face as they implement multi-cloud strategies. As organizations move forward with a multi-cloud strategy, they are challenged to enforce consistent security configurations across workloads and applications. 

    Challenge 1: One false expectation is that you can just extend on-premises security infrastructure to the cloud. Unfortunately, tools from just one cloud vendor, or your own scripts written for your on-premise data centers, are not going to get you through the challenges of a multi-cloud architecture. You need a cloud-native security platform that allows you to protect different cloud services from multiple providers. 

    Probable solution: It is highly risky to implement the same “data governance, access, and security framework” across multiple clouds. This approach will result in inconsistencies in policy implementations across different cloud service providers and different service environments (SaaS, PaaS, and IaaS). It is far better to allow cloud service providers to deliver service-related security, while organizations, on the other hand, take responsibility for data security within the multi-cloud environment. Cloud service providers should monitor infrastructure-related security threats, while the end users – organizations – secure their data, cloud applications, and other assets on cloud. 

    Challenge 2: A poorly developed multi-cloud security strategy can result in the loss of data integrity and confidentiality. Enabling multi-cloud architecture for better security and privacy involves the risk of losing track of data. So, the answer is adopting a “data-centric security approach” within an organization, which ensures that an organization’s most critical assets stay protected regardless of their location: on-premises, on a private cloud, or in a multitude of public cloud service provider environments. With data-centric security, organizations substantially reduce the risks related to regulatory requirements in the multi-cloud.

    Probable solution: Having a complete approach to data privacy and security throughout your organization helps to mitigate costs, complexity, and, in turn, risk. This approach makes it possible to protect data throughout the data lifecycle. Comprehensively managing data encryption, or data masking, for data protection in cloud or on-premises environments is critical. 
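
    One way to make this data-centric approach concrete is client-side encryption: the record is encrypted before it ever reaches any cloud, so the protection travels with the data regardless of provider. A minimal sketch, assuming the Python cryptography package and a key kept in your own key store:

        from cryptography.fernet import Fernet

        # The key stays under the organization's control (for example in an
        # on-premises KMS or HSM); only ciphertext is handed to cloud storage.
        key = Fernet.generate_key()
        f = Fernet(key)

        record = b'{"customer_id": 42, "iban": "NL00BANK0123456789"}'  # hypothetical record
        ciphertext = f.encrypt(record)      # safe to store with any cloud provider
        plaintext = f.decrypt(ciphertext)   # only key holders can read it back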

    Challenge 3: While many people claim that the cloud platform has built-in, inherent security controls, and that you do not have to bother to implement your own, keep in mind that the cloud is about shared security. For instance, you might be using the services of CrowdStrike for security on the cloud platforms, and Falcon Horizon/Cloud Security Posture Management (CSPM) for protection against configuration errors.

    Probable solution: While the “shared security approach” enables cloud service providers to ensure the security of certain services, your organization’s internal security teams must take responsibility for the security of others. 

    Challenge 4: Protecting sensitive data in the cloud is an additional challenge for multi-cloud organizations. This means organizations have to routinely revisit and re-engineer their security strategies and tools related to data access to incorporate real-time, continuous monitoring and compliance measures. This becomes challenging when organizations try to support least-privileged access models across all their data stores in the cloud. Generally speaking, enterprises have little control over data exposures and security gaps. 

    Probable solution: Because protecting workloads spread across on-premises and multiple cloud frameworks is especially complex, automation is crucial for monitoring workloads such as VMs and Kubernetes containers distributed over multiple environments – on-premises, mono-cloud, and multi-cloud. Automated solution platforms help keep track of and monitor workloads across systems.
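
    As an illustration of such automation, assuming the official Python kubernetes client and a kubeconfig that already lists one context per cluster, a few lines are enough to take a workload inventory across every environment:

        from kubernetes import client, config

        # Count running pods in every cluster defined in the local kubeconfig.
        contexts, _ = config.list_kube_config_contexts()
        for ctx in contexts:
            api = client.CoreV1Api(
                api_client=config.new_client_from_config(context=ctx["name"])
            )
            pods = api.list_pod_for_all_namespaces(watch=False)
            print(f'{ctx["name"]}: {len(pods.items)} pods')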

    Challenge 5: This is the most formidable challenge – an acute shortage of qualified security professionals with deep knowledge and experience in working on multiple cloud platforms. Given this shortage of expertise and experience, all the above-mentioned challenges could result in significant security vulnerabilities. When adopting a cloud strategy, security leaders face challenges like controlling cloud costs, data privacy, and security issues.

    Probable solution: As more organizations shift toward full-cloud adoption, security teams will need the right talent and resources to manage their cloud infrastructures and navigate security and privacy obstacles posed by the cloud.

    Given the range and complexity of privacy and security challenges in the multi-cloud, the security settings must be consistent across all of your clouds. Ongoing communications with cloud service providers is necessary to ensure that all are following the same security measures. Cloud security technologies such as cloud security posture management, cloud workload protection, cloud identity and rights management, data loss prevention, encryption, and multi-factor authentication (MFA) are the most common technologies that should be kept in mind while planning privacy and security for multi-cloud environments. 

    Other Multi-Cloud Issues That Threaten Data Privacy and Security 

    Here are some other concerns unique to the multi-cloud environment, which makes enterprise Data Management an ongoing challenge: 

    • Latency due to distance between the organization’s data center and cloud service providers is a grave concern. This can reduce the speed at which employees can access critical data. 
    • Bandwidth issues can also pose a challenge. If a multi-cloud organization keeps all its critical data with one cloud service provider’s servers, it is likely that bandwidth issues will surface when the amount of data transferred exceeds the provider’s capacity. This can be particularly problematic for businesses that operate in real-time environments, such as healthcare, financial services, or manufacturing businesses. 

    Wrap-Up

    Each cloud platform is different, so even if you successfully understand who has access to what data and workloads, keeping up with vendor updates and new controls requires ongoing monitoring. To run a successful, secure multi-cloud operation, you probably need an external, centralized platform that controls access for users with appropriate permissions.

    A data security strategy for cloud environments requires ongoing, continuous evaluation to ensure data protection, advanced standards compliance, and adherence to all regulatory laws. Data Management practices are required for the regulation of users’ access to sensitive data in the cloud to enhance data privacy and security. 

    Author: Paramita Ghosh

    Source: Dataversity

  • Rackspace: Cloud consultancy heeft de toekomst

    Rackspace: Cloud consultancy heeft de toekomst

    Rackspace ziet de opkomst van de (multi-)cloud als service provider met veel enthousiasme gebeuren. Het bedrijf moet echter wel continu op blijven letten dat het ook daadwerkelijk de juiste diensten aanbiedt, zo stelt Bert Stam. Naast beheer betekent dat ook steeds meer consultancy.

    Het afgelopen decennium is er sprake geweest van een milde identiteitscrisis bij Rackspace. Van oudsher is dit bedrijf namelijk eigenlijk altijd een service provider geweest. Doordat het samen met NASA OpenStack in de markt heeft gezet, veranderde Rackspace echter tijdelijk in een technology provider, inclusief een eigen public cloud. Inmiddels is de populariteit van OpenStack tanende, onder andere vanwege de enorme groei van partijen zoals AWS, Microsoft Azure en Google Cloud. Dat betekent dus ook dat Rackspace zich weer duidelijker is gaan toeleggen op het zijn van een service provider.

    Het gaat ook goed met Rackspace in deze regio volgens Stam, Sales and Marketing Director Northern Europe, ook al had men hier bij Rackspace niet altijd evenveel focus op. Dat is nu overigens wel aan het veranderen, dus we mogen nog meer groei verwachten, met name in Noord-Europa (waar de Benelux ook bij hoort) en de DACH-regio. Er is namelijk nog veel te winnen hier.

    Meer complexiteit en standaardisatie

    Door de cloud-beweging die Rackspace zelf mede op gang heeft gebracht, is het leveren van diensten aan organisaties er echter niet eenvoudiger op geworden, zo stelt Stam. De versnippering van de IT-omgeving over meerdere locaties brengt allerlei nieuwe uitdagingen met zich mee. 'Maar we willen nog wel altijd de fanatical support bieden waar we bekend om staan, ook in de steeds ingewikkelder wordende wereld'.

    Het klinkt wellicht wat vreemd, maar om de toegenomen complexiteit te lijf te gaan, is er behoefte aan meer standaardisatie. Zonder standaardisatie kun je immers niet of lastig schalen. Aan de andere kant zijn meer op maat gemaakte omgevingen ook een logisch gevolg van de versnippering in meerdere clouds. Dat is best paradoxaal, zo geeft ook Stam toe, maar wel de realiteit. Het is aan Rackspace om hier een oplossing voor te verzinnen.

    Kijken we naar de diensten die Rackspace aanbiedt voor AWS (aangezien we dit gesprek bij de AWS Summit hebben), dan valt op dat men daar duidelijk probeert om een gulden middenweg te vinden tussen complexiteit en standaardisatie. Aviator (de managed service voor AWS van Rackspace) wordt niet (meer) als monoliet aangeboden, maar ook weer niet opgebroken in te veel kleine onderdelen. Er is geprobeerd om de bouwblokken van deze dienst zo groot mogelijk te maken.

    Stam realiseert zich overigens terdege dat dit een tamelijk vloeibaar geheel is en met het verstrijken van de tijd kan veranderen. Uiteindelijk is het allemaal onderdeel van de gewenning die de cloud nog altijd van gebruikers vraagt. Waar het soms als de oplossing voor alle IT-problemen gezien wordt, zien anderen het er juist helemaal niet in zitten. De waarheid zal ongetwijfeld ergens in het midden liggen.

    Niet alleen support, ook consultancy

    Het leveren van professional services is een belangrijke component voor Rackspace om complexiteit en standaardisatie met elkaar te verbinden. Het draait meer en meer om wat je vooraf met een klant doorspreekt. Waarom wil dit bedrijf naar de cloud? Is er wel een goede business-case?

    Een andere vraag die ook altijd gesteld moet worden voor je de overstap met (een deel van) je IT-omgeving naar de cloud maakt, is hoe het is gesteld met de security. 'Je moet de landing zone voor de applicaties goed op orde hebben', stelt Stam. Je kunt niet beginnen met een migratie zonder er zeker van te zijn dat deze landing zone geregeld is. 'En zelfs dan moet je altijd goed blijven nadenken over wat wel en niet mogelijk en wenselijk is'.

    Technisch gezien is er namelijk van alles mogelijk, maar het moet wel meerwaarde bieden. Je kunt bijvoorbeeld heel erg graag applicaties en workloads naar de cloud willen migreren, maar als deze hard in een on-premise Oracle- of Microsoft-database zijn gecodeerd, dan is het maar de vraag of je dat moet willen. Of je moet ook meteen de hele database naar de cloud overzetten. Dit soort lift-en-shift is prima mogelijk, maar het moet ook commercieel aantrekkelijk en haalbaar zijn. Dat zal doorgaans niet het geval zijn als je min of meer verplicht wordt om je volledige database mee te migreren. En dan hebben we het nog niet over hoe het een en ander uiteindelijk presteert in de cloud. Ook daar moet je vooraf al een goede inschatting over kunnen maken.

    Richting de toekomst

    Onder de streep start men gesprekken bij Rackspace tegenwoordig dus niet meer alleen met praten over de diensten die het bedrijf kan leveren. Ze willen een stap terug nemen en beginnen bij de applicaties. Wat wil een klant daarmee aanvangen? Is er bijvoorbeeld interesse in DevOps? Dat heeft een impact op hoe je deze dan optimaal migreert.

    Je zou de huidige klantbenadering van Rackspace kunnen zien als een vorm van co-innovatie met de klanten. Het wordt wat dat betreft erg interessant om te zien hoe het bedrijf dit verder vorm gaat geven. Want fanatical support willen blijven leveren in een steeds complexer wordende wereld is mooi, maar er moet ook genoeg aan verdiend kunnen worden. Om complexiteit steeds beter het hoofd te kunnen bieden, wordt er simpelweg steeds meer gevraagd van de service providers. Daar moet een balans in gevonden worden. We zullen Rackspace in ieder geval goed blijven volgen om te zien hoe men die balans uiteindelijk gaat vinden.

    Auteur: Sander Almekinders

    Bron: CIO

     

  • Recommendations for transitioning your business to a hybrid cloud environment successfully

    Recommendations for transitioning your business to a hybrid cloud environment successfully

    Why move to the cloud, and if you do, how can you do so smoothly? BMC's Bill Talbot, VP for solutions marketing, shares some suggestions for making a successful transition.

    From your perspective, what's driving enterprises to the cloud?

    Bill Talbot: It's simple: efficiency. Enterprise customers are increasingly moving to the cloud because they want to be more agile, accelerate innovation, and, most importantly, make operations run more smoothly in general. The cloud has reached the point where it's no longer an emerging category. Instead, it's now part of the mainstream. A recent report by 451 Research found a majority (90%) of organizations surveyed are using a cloud service of some sort. Pointing even further to growing adoption and cloud maturity, 69% of enterprises surveyed planned to work in multicloud and hybrid IT environments this year.

    If the cloud is so beneficial, why should an enterprise not move everything to the cloud? Why should a CIO consider a hybrid environment instead?

    When it comes to adopting the cloud, organizations shouldn't do a complete overhaul and move everything to the cloud. Instead, they should carefully consider their goals and choose specific areas where transitioning to the cloud makes business or technical sense. A hybrid cloud environment enables organizations to balance control and flexibility, meaning enterprises have the option to choose the IT resources they purchase based on what they think is best for the business.

    For example, CIOs should consider the demand or transaction load an application requires. If demand is high and causes big, periodic spikes that require additional resources for short periods, developing a cloud app with auto-scaling services could be the best strategy. On the other hand, if the load is relatively consistent, it may be more cost effective to run it on on-premises resources and reserve operational budget for other needs.
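
    A rough, back-of-the-envelope comparison, using assumed instance prices and a hypothetical workload profile, shows why the shape of demand matters so much for this decision:

        # Assumed figures for illustration only: a spiky workload that needs 12
        # instances for ~60 hours a month but only 4 the rest of the time.
        hours_per_month = 730
        baseline_instances, peak_instances, peak_hours = 4, 12, 60
        on_demand_rate, reserved_rate = 0.10, 0.065   # assumed $/instance-hour

        provisioned_for_peak = peak_instances * hours_per_month * reserved_rate
        autoscaled = (baseline_instances * hours_per_month
                      + (peak_instances - baseline_instances) * peak_hours) * on_demand_rate

        print(f"Always provisioned for peak: ${provisioned_for_peak:,.0f}/month")
        print(f"Auto-scaled around spikes:   ${autoscaled:,.0f}/month")

    With a flat, predictable load the comparison tips toward the steadily provisioned option instead, which is exactly the trade-off described above.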

    What are the most important considerations for any CIO thinking about moving their company to a hybrid cloud environment?

    Top considerations are speed to innovation, cost, security, and scale. They must influence the decision to invest in a cloud-based service. CIOs evaluating a potential transition to the cloud must look at it two ways: first, as a business decision and, second, as a technology decision.

    Before committing to the technology, CIOs need to clearly outline their business objectives, implementation plan, timeline, and costs to ensure it's the best option. Beyond this, CIOs need to consider regulatory and compliance guidelines and ensure the cloud services adhere to these requirements. Security is a growing challenge for CIOs as they transfer applications and infrastructure to the public cloud. It's important to understand that as a cloud buyer, you're responsible for securing the cloud services you purchase.

    How does the cloud affect speed to innovation?

    Cloud service providers offer resources and technology services that can take significant time to build or acquire in the data center, such as machine learning algorithms. Cloud services are also more easily accessible, supporting and accelerating agile development processes and digital services to keep businesses innovating at a faster rate. In this time of mass digital transformation, organizations find themselves challenged with growing requests that IT operations simply cannot keep up with in a sustainable, long-term way. With seemingly on-demand cloud services now available, organizations are empowered to accelerate innovation and offer new products and services to customers faster.

    Costs are often touted as a big advantage of moving to the cloud: everything from lower upfront capital expenditures to staff savings. What else should a CIO think about when it comes to costs?

    CIOs now have a choice between spending operational or capital budget and need to consider which is best for their business. Managing operational budgets for cloud services requires new tools and discipline. CIOs should embrace predictive analytics and machine learning tools to forecast and manage their cloud spend and budget. By embracing these technologies, cloud ops teams can anticipate what typical cloud usage and spend will be, and even get alerted if they're tracking beyond their allocated resources.
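
    A toy version of such a forecast-and-alert check, using only numpy and invented monthly spend figures, might look like this:

        import numpy as np

        # Hypothetical spend for the last six months, in euros.
        months = np.arange(6)
        spend = np.array([8200, 8600, 9100, 9400, 9900, 10400], dtype=float)
        budget = 10500

        # Simple linear trend as a stand-in for richer predictive models.
        slope, intercept = np.polyfit(months, spend, deg=1)
        forecast = slope * 6 + intercept

        print(f"Forecast for next month: EUR {forecast:,.0f}")
        if forecast > budget:
            print("Alert: projected spend exceeds the allocated budget")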

    CIOs can also leverage these tools to rank cloud projects based on cost, from highest to lowest, which helps them easily identify resources that should be consolidated, or possibly eliminated, to keep costs on target.

    How is scale relevant here? Isn't scale only an important consideration for the largest enterprises?

    Scale is important to any enterprise and on multiple levels. At the macro level, there is the scale, meaning the large number and variety of infrastructure resources, needed to support the many business services of a large enterprise. On a more micro level, scale also applies to the volume of data collected and processed or analyzed, as well as to growing or erratic changes in demand for a specific application or business service.

    Cloud services can address all of these scenarios. Scalability is also pushing organizations to acquire the right cloud management tools to predict changes that require scale and to understand the costs associated.

    Author: James E. Powell

    Source: TDWI

  • Research details developments in the business intelligence (BI) market that is estimated to grow at 10% CAGR to 2020

    According to an analyst cited in the global business intelligence market report: 'In the past few years, social media has played a critical role in SMEs and mid-sized organizations. Many SMEs are increasingly embracing this trend and integrating their BI software with social media platforms.'

    Market outlook of the business intelligence market: the market research analyst predicts that the global business intelligence market will grow at a CAGR of around 10% during the forecast period. The growing adoption of data analytics by organizations worldwide is a key driver for the growth of this market.

    The majority of corporate data sources include data generated from enterprise applications along with newly generated cloud-based and social network data. Business intelligence tools are useful in the retrieval and analysis of this vast and growing volume of discrete data.

    They also help optimize business decisions, discover significant weak signals, and develop indicator patterns to identify opportunities and threats for businesses.

    The increased acceptance of cloud BI solutions by SMEs is also boosting the growth of this market. The adoption of cloud services allows end-users to concentrate on core activities rather than managing their IT environment.

    Cloud BI solutions enable applications to be scaled quickly, allow easy integration with third-party applications, and provide security at all levels of the enterprise IT architecture so that these applications can be accessed remotely.

    Market segmentation by technology of the business intelligence market:

    • Traditional BI
    • Mobile BI
    • Cloud BI
    • Social BI

    The mobile BI segment accounts for approximately 20% of the global BI market. It enables the mobile workforce to get business insights by data analysis, using applications optimized for mobile and smart devices.

    The growing smartphone adoption is likely to emerge as a key growth driver for this segment during the forecast period.

    Market segmentation by deployment of the business intelligence market

    • Cloud BI
    • On-premises BI

    The on-premise segment accounted for 86% of the market share during 2015. However, the report anticipates this segment to witness a decline in its shares by the end of the forecast period.

    In this segment, the software is purchased and installed on the server of an enterprise. It requires more maintenance but is highly secure and easy to manage.

    Geographical segmentation of the BI market

    • Americas
    • APAC
    • EMEA

    The Americas dominated the market during 2015, with a market share of around 56%. The high adoption of cloud BI solutions in this region is the major growth contributor for this market.

    The US is the market leader in this region, as most of the key vendors are based there.

    Competitive landscape and key vendors

    Microsoft is one of the largest BI vendors and offers Power BI, which helps to deliver business-user-oriented, self-service data preparation and analysis needs through Excel 2013 and Office 365. The competitive environment in this market is expected to intensify during the forecast period due to an increase in R&D innovations and mergers.

    The market is also expected to witness a growing trend of acquisitions by the leading players. The key players in the market are expected to diversify their geographical presence during the forecast period.

    The key vendors of the market are -

    • IBM
    • Microsoft
    • Oracle
    • SAP
    • SAS Institute

    Other prominent vendors in the market include Actuate, Alteryx, Board International, Birst, Datawatch, GoodData, Infor, Information Builders, Logi Analytics, MicroStrategy, Panorama Software, Pentaho, Prognoz, Pyramid Analytics, Qlik, Salient Management Company, Tableau, Targit, Tibco Software, and Yellowfin.

    Key questions answered in the report

    • What will the market size and the growth rate be in 2020?
    • What are the key factors driving the BI market?
    • What are the key market trends impacting the growth of the BI market?
    • What are the challenges to market growth?
    • Who are the key vendors in the global BI market?
    • What are the market opportunities and threats faced by the vendors in the BI market?
    • What are the trending factors influencing the market shares of the Americas, APAC, and EMEA?
    • What are the key outcomes of the five forces analysis of the BI market?

    Source: WhaTech

  • Security Concerns Grow As Big Data Moves to Cloud

    Despite exponential increases in data storage in the cloud along with databases and the emerging Internet of Things (IoT), IT security executives remain worried about security breaches as well as vulnerabilities introduced via shared infrastructure.

    A cloud security survey released Wednesday (Feb. 24) by enterprise data security vendor Vormetric and 451 Research found that 85 percent of respondents use sensitive data stored in the cloud, up from 54 percent last year. Meanwhile, half of those surveyed said they are using sensitive data within big data deployments, up from 31 percent last year. One-third of respondents said they are accessing sensitive data via IoT deployments.

    The upshot is that well over half of those IT executives surveyed are worried about data security as cloud usage grows, citing the possibility of attacks on service providers, exposure to vulnerabilities on shared public cloud infrastructure and a lack of control over where data is stored.

    Those fears are well founded, the security survey notes: “To a large extent both security vendors and enterprises are like generals fighting the last war. While the storm of data breaches continues to crest, many remain focused on traditional defenses like network and endpoint security that are clearly no longer sufficient on their own to respond to new security challenges.”

    Control and management of encryption keys is widely seen as critical to securing data stored in the cloud, the survey found. IT executives were divided on the question of managing encryption keys, with roughly half previously saying that keys should be managed by cloud service providers. That view has shifted in the past year, the survey found, with 65 percent now favoring on-premise management of encryption keys.
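
    The on-premise key management that respondents favour is often implemented as envelope encryption: a data key protects each object, and that data key is itself wrapped by a master key that never leaves the organisation. A minimal sketch, assuming the Python cryptography package:

        from cryptography.fernet import Fernet

        master_key = Fernet.generate_key()   # held on-premise (HSM/KMS), never sent to the cloud
        data_key = Fernet.generate_key()     # per-object key

        ciphertext = Fernet(data_key).encrypt(b"sensitive record")
        wrapped_key = Fernet(master_key).encrypt(data_key)

        # Store ciphertext + wrapped_key with the cloud provider; reading the data
        # back always requires the on-premise master key.
        recovered_key = Fernet(master_key).decrypt(wrapped_key)
        plaintext = Fernet(recovered_key).decrypt(ciphertext)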

    In response to security concerns, public cloud vendors like Amazon Web Services, Google, Microsoft and Salesforce have moved to tighten data security through internal development, partnerships and acquisitions in an attempt to reduce vulnerabilities. Big data vendors have lagged behind, but the survey noted that acquisitions by Cloudera and Hortonworks represent concrete steps toward securing big data.

    Cloudera acquired encryption and key management developer Gazzang in 2014 to boost Hadoop security. Among Hortonworks’ recent acquisitions is XA Secure, a developer of security tools for Hadoop.

    Still, the survey warned, IoT security remains problematic.

    When asked which data resources were most at risk, 54 percent of respondents to the Vormetric survey cited databases while 41 percent said file servers. Indeed, when linked to the open Internet, these machines can be exposed to vulnerabilities similar to recent “man-in-the-middle” attacks on an open source library.

    (Security specialist SentinelOne released an endpoint platform this week designed to protect enterprise datacenters and cloud providers from emerging threats that target Linux servers.)

    Meanwhile, the top security concerns for big data implementations were: the security of reports that include sensitive information; sensitive data spread across big data deployments; and privacy violations related to data originating in multiple countries. Privacy worries have been complicated by delays in replacing a 15-year-old “safe harbor” agreement struck down last year that governed trans-Atlantic data transfers. A proposed E.U.-U.S. Privacy Shield deal has yet to be implemented.

    Despite these uncertainties and continuing security worries, respondents said they would continue shifting more sensitive data to the cloud, databases and IoT implementations as they move computing resources closer to data. For example, half of all survey respondents said they would store sensitive information in big data environments.

    Source: Datanami

  • The Challenges of Moving to a Cloud Environment

    The Challenges of Moving to a Cloud Environment

    While no business could have fully prepared for the COVID-19 pandemic’s impact, those with strong cloud-based strategies have been able to adapt to the remote work reality. But even for companies that have migrated to the cloud or are in the process, a dispersed workforce presents challenges when you consider the trade-off between a streamlined, cohesive work process and network security. Despite this, the move from on-premise to cloud-based solutions isn’t slowing, making cloud migration still desirable.

    In fact, recent research points to increasing public cloud adoption over the next year, even amid, or perhaps as a result of, the pandemic and an overall downturn in IT spending. According to Instinet, 68% of CIOs indicated cloud services would become more of a priority for their businesses and reported a reduction in on-premise workloads, from 59% relying on on-prem assets in 2019 to an estimated 35% by 2021.

    For businesses, cloud and SaaS services offer an easy way for employees to collaborate and access the information they need outside the confines of a physical office space. For employees, these solutions are desirable in part because they’re so easy to use. When not sanctioned through an employer, all it takes is an email or credit card to sign up, and an employee can start a CRM package, open a Dropbox, or create an iCloud account, among a range of other activities. While it sounds benign, any of these services could be a place for sharing company information, from trade secrets, to intellectual property, and personally identifiable information.

    In order to enable employees to get work done and safeguard sensitive information, organizations must find a way to both connect and manage systems and access. Cloud migration is a big undertaking, and far too often organizations overlook what a crucial part identity governance plays in implementing successful and sustainable cloud migration initiatives. By baking identity governance into your plan from the get-go you can avoid some of the main security pitfalls of transitioning to the cloud.

    One major challenge is employee buy-in. It may sound counterintuitive, as the cloud is meant to streamline work processes, but learning new systems and working out permissions involves a learning curve that companies need to account for. People want to get work done as quickly and efficiently as possible, and adding another roadblock for them to access what they need can result in bypassing security protocols. Organizations that have not already done so should implement safeguards like multi-factor authentication (MFA), but should also consider making the second form of identity something easy to access, like a code sent to a mobile device or something the person has at all times versus a security question or a physical token they need to remember.
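
    One common, low-friction variant of that second factor is a time-based one-time password from an authenticator app on the phone the employee already carries. A minimal sketch, assuming the pyotp package:

        import pyotp

        # The shared secret is provisioned once to the user's authenticator app;
        # every login then checks a short-lived six-digit code.
        secret = pyotp.random_base32()
        totp = pyotp.TOTP(secret)

        code_from_user = totp.now()   # in practice, typed in by the user from their device
        print("accepted" if totp.verify(code_from_user) else "rejected")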

    A good cloud migration strategy is not just about wrangling your employees, though—it’s about choosing your cloud partner wisely. When you rely on cloud solutions, you’re entrusting another party with your valuable customer and company data. Even if the information is compromised under their care, it’s your business that will pay the price at both a financial and reputational cost. Before embarking on a cloud journey be clear about your prospective cloud provider’s security practices, and don’t just make them tell you—have them show you. Ask where your data will be stored, what measures they take to protect it, and what practices they use to keep it secure.

    Another challenge beyond vendor selection and employee onboarding is simply keeping up with the pace of technology. The last few years have looked like an arms race to the cloud, and as a result, a lot of projects fail. Migrating all your data with different levels of sensitivity and access privilege should be done intentionally, and many bite off more than they can chew. This causes mistakes and headaches in the long run, and the worst part is, it’s easily avoidable. Leverage third-party resources that have identity expertise, such as an outside consultant or an analyst firm to help you define your cloud requirements. Make sure stakeholders—leadership, investors, department heads, etc.—are involved in executing cloud projects, as they span the business.

    The work doesn’t stop there, though. Once you do have a solid strategy, select a vendor to partner with, and start onboarding and training employees, think ahead about how you’ll maintain a healthy security posture. Consider using a cloud access security broker, independent software that sits between cloud service users and cloud applications and monitors all activity, or an ethical hacker to help identify weak areas and enforce security policies. For highly-regulated industries, such as healthcare and life sciences or finance, managing evolving threats becomes especially important. By not complying with strict laws and requirements to protect sensitive information, you could be setting yourself up for a world of hurt.

    Security is a top reason that organizations stall their cloud endeavors—and for good reason. However, with the promise of better IT processes, increased productivity and collaboration, and a host of other benefits, the advantages of cloud migration far outweigh the risks. Success takes due diligence and a digestible strategy to prevail, so be sure to do the homework, tweak as you go, and remember, it’s a marathon, not a sprint.

    Author: Jackson Shaw

    Source: Open Data Science

  • The most wanted cloud-related skills for organizations migrating to the cloud

    The most wanted cloud-related skills for organizations migrating to the cloud

    Given the widespread move to cloud services underway today, it’s not surprising that there’s growing demand for a variety of cloud-related skills.

    Earlier this year, IT consulting and talent services firm Akraya Inc. compiled a list of the most in-demand cloud skills for 2019. Let's take a look at them:

    Cloud security

    Cloud security is a shared responsibility between cloud providers and their customers. That creates a need for professionals with specialization in cloud security skills, including those who can leverage cloud security tools.

    Machine learning (ML) and artificial intelligence (AI)

    In recent years cloud vendors have developed and expanded their set of tools and services that allow organizations to reap the benefits of machine learning and artificial intelligence in the cloud. Companies need people who can leverage these new capabilities of the cloud.

    Cloud migration and deployment within multi-cloud environments

    Many organizations want to adopt multiple cloud services and are looking for professionals who can contribute to their cloud migration efforts. Cloud migration has its risks and is not an easy process; improper migration often leads to business downtime and data vulnerability. This means that employees with the appropriate skill set are key.

    Serverless architecture

    In a server-based architecture, the underlying cloud server infrastructure still needs to be managed by cloud developers. Today’s cloud, by contrast, offers industry-standard technologies and programming languages that help move serverless applications from one cloud vendor to another, Akraya said. Companies need expertise in serverless application development.
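
    By way of illustration, a serverless function is typically nothing more than a small handler the platform invokes on demand; the sketch below follows the AWS Lambda handler convention, with the event contents assumed for the example:

        import json

        # The cloud platform provisions, runs and scales the servers; the
        # developer ships only this function.
        def handler(event, context):
            name = (event or {}).get("name", "world")
            return {
                "statusCode": 200,
                "body": json.dumps({"message": f"hello, {name}"}),
            }

        if __name__ == "__main__":
            print(handler({"name": "cloud"}, None))   # local smoke test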

    Author: Bob Violino

    Source: Information-management

  • Toegang tot RPA-bots in Azure dankzij Automation Anywhere

    Toegang tot RPA-bots in Azure dankzij Automation Anywhere

    Automation Anywhere heeft het mogelijk gemaakt om toegang te krijgen tot zijn Robotic Process Automation (RPA)-bots vanuit Azure. Het bedrijf stelt dat er een uitgebreide samenwerking is opgezet met Microsoft, die gezamenlijke productintegratie, gezamenlijke verkopen en gezamenlijke marketing mogelijk moet maken. 

    Automation Anywhere koos daarnaast voor Azure als zijn cloud-provider, waardoor gezamenlijke klanten altijd en overal toegang hebben tot automatiseringstechnologie, schrijft idm. Organisaties kunnen het RPA-platform van Automation Anywhere op Azure, on premise en in een public of private cloud hosten.

    Alysa Taylor, Corporate Vice President van Cloud and Business bij Microsoft Business Applications and Industry, stelt dat de visie van Automation Anywhere overeenkomt met die van Microsoft. 'Dat is de visie om data en intelligence in al onze producten, applicaties, diensten en ervaringen te stoppen'. Volgens Mihir Shukla, CEO en mede-oprichter van Automation Anywhere, stelt de samenwerking bedrijven in staat om efficiënter te worden, de kosten via automatisering te verlagen en om werknemers de kans te geven om te focussen op wat ze het beste doen.

    Microsoft gaat op zijn beurt de automatiseringsproducten van Automation Anywhere uitlichten in zijn Executive Briefing Centres wereldwijd. Daardoor kunnen klanten hands-on demonstraties krijgen van Microsoft-producten, die mogelijk worden gemaakt door Automation Anywhere-technologie.

    Oracle

    Automation Anywhere kondigde in april aan ook een samenwerking te hebben opgezet met Oracle Integration Cloud. De twee bedrijven willen intelligente automatisering versnellen en de adoptie van door kunstmatige intelligentie aangedreven software-bots in de Integration Cloud mogelijk maken.

    Met het RPA-platform van Automation Anywhere moet het voor klanten van de Oracle Integration Cloud mogelijk worden om complexe zakelijke processen te automatiseren, zodat de werknemers zich op werk met meer waarde kunnen focussen. Ook moet de organisatorische efficiëntie verhoogd worden.

    Als onderdeel van de samenwerking wordt de enterprise RPA-platform connector voor de Integration Cloud beschikbaar. Oracle-klanten krijgen verder toegang tot de software-bots van Automation Anywhere, en de twee bedrijven werken samen aan extra bot-creaties speciaal voor Oracle. Die bot-creaties moeten beschikbaar worden in de Automation Anywhere Bot Store.

    Auteur: Eveline Meijer

    Bron: Techzine

  • Useful advice for entrepreneurs who consider starting a microbusiness

    Useful advice for entrepreneurs who consider starting a microbusiness

    Right now, with inflation and interest rates rising, many aspiring entrepreneurs are reconsidering plans to launch new businesses. It is, they believe, the wrong time to go all-in on a new venture. And for those with grand plans and ambitions, that may be true. But that doesn’t mean entrepreneurs have no choice but to sit on the sidelines until economic conditions improve.

    Instead of trying to start a resource-intensive new startup, they can instead work toward founding a microbusiness. Microbusinesses are, after all, the most common type of business in the US, accounting for 74.8% of all private-sector employment. And they’re quite easy to start. Here are four technology tips for entrepreneurs wishing to try their hands at starting a microbusiness.

    Embrace Automation Early On

    One of the biggest keys to running a successful microbusiness is to always look for ways to do more with less. And there’s no better way to accomplish that than by turning to automation. That’s why entrepreneurs should embrace automation early on in their business journey and look for ways to utilize it in every aspect of their operation. After all, time is money, and in a microbusiness every dollar saved goes straight into the entrepreneur’s pocket.

    Some of the most effective types of automation for microbusinesses include:

    • Social media and marketing automation
    • Accounting automation
    • Scheduling automation
    • Form automation

    The idea is to put as many as possible of the day-to-day tasks involved in running the microbusiness on autopilot, freeing up the entrepreneur to handle work that leads directly to revenue.
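
    As a small example of scheduling automation, assuming the third-party Python schedule package and a placeholder task, a recurring chore can be taken off the to-do list in a handful of lines:

        import time

        import schedule

        def send_weekly_invoices():
            # Placeholder for the real task (generate PDFs, email clients, etc.).
            print("Generating and emailing this week's invoices...")

        schedule.every().friday.at("09:00").do(send_weekly_invoices)

        while True:
            schedule.run_pending()
            time.sleep(60)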

    Leverage Virtual Office Technology

    For most businesses, the two biggest cost drivers are labor and real estate. Microbusinesses don’t have high labor costs, and they can all but eliminate real estate costs by virtualizing their offices. In most cases, there’s no business justification for maintaining a physical office space. Instead, an entrepreneur behind a microbusiness can go fully remote and connect with other employees or freelancers via video, voice, or text chat when necessary.

    But to make a virtual office setup work, it’s important to come up with ways to maintain customer relationships without the physical component they often have in a traditional office environment. In other words, it’s necessary to replace face-to-face meetings with other, all-digital customer outreach efforts. Some of the best ways to do so include:

    • Holding invite-only live customer video events
    • Providing live online training and knowledge-sharing sessions
    • Organizing virtual networking events

    Be a Cloud-native Business

    Flexibility is one of the key assets that allows a microbusiness to compete in today’s economy. It means they can often be nimbler and more customer-responsive than bigger competitors. That makes it easy for microbusinesses to innovate and bring new products and services to market as fast as possible. But flexibility doesn’t equal capability.

    To make the most of their flexibility, microbusinesses have to be able to develop new capabilities at a moment’s notice and discard parts of their operations that are no longer viable. But that gets increasingly harder to do if there are significant technology investments involved. The solution, though, is simple: don’t own your technology stack.

    Instead, microbusinesses should aim to be cloud-native. A cloud-native business eschews costly on-premises hardware and software solutions. In their place, they rely on contracted access to cloud-based solutions that they can add to or subtract from at a moment’s notice. For example, instead of maintaining a costly mail server to support communications, they’d perhaps start with a free business email address and then turn to a cloud email provider like Outlook.com or Gmail to scale up when the need arose.

    The reason this is so important is that it can help a microbusiness keep their costs and capabilities closely aligned with the work they’re doing at a given moment. It’s a way of eliminating unnecessary spending during lean times while preserving the ability to pivot when clients come calling with new requests.

    Consider Hardware as a Service

    Last but not least, any microbusiness would do well to try and contain its IT hardware costs by exploring hardware as a service (HaaS) options. They’re typically made available by managed service providers (MSPs) as an option for their customers. By leasing IT equipment rather than buying it, microbusinesses can cut major capital costs that they’d ordinarily face to get up and running.

    Instead, they pay predictable monthly fees to cover their technology needs and don’t have to worry about complex depreciation formulas and the like. Plus, most HaaS offerings cost less than outright purchases when you consider the short replacement cycle of business technologies these days. And payments for leased hardware are almost always 100% tax-deductible, making them the perfect option for a lean-running microbusiness.

    The Right Business at the Right Moment

    Right now, as economic uncertainty mounts, microbusinesses are the perfect entrepreneurial option to meet the moment. And the technology tips detailed above should help any entrepreneur to launch a lean, nimble, and profitable microbusiness no matter how strong the economic headwinds become. By putting them to work early on, they’ll help to contain costs and improve capabilities at every turn – leading to a successful launch and a sustainable future.

    Author: Philip Piletic

    Source: Datafloq

  • Different perspectives on the transition to the cloud

    Different perspectives on the transition to the cloud

    CIO recently organized a meeting together with Juniper Networks in the old air traffic control tower at Schiphol-Oost. Together with seven invited guests they discussed, quite fittingly in that setting, the journey to the cloud. Opinions on the subject turned out to differ considerably.

    We structured the discussion around three propositions. The first concerns pure network connectivity, the second security, and the third automation. This way we start with the basics, connectivity, then look at how it should be secured, and finally at how connectivity and security can be automated.

    Proposition 1: A network is a network, regardless of the deployment model

    We discuss this first proposition with Naomi Du Burck, manager front office IT Operations at ANWB in The Hague, and with Peter Verdiesen, head of ICT at Countus accountants en adviseurs in Zwolle. These are two people from industries where the words 'public cloud' mainly evoke challenges. For both organizations, then, it very much matters what the network looks like.

    For both participants, the GDPR is without doubt the most obvious red flag when it comes to the cloud. Both ANWB and Countus handle large amounts of member and client data. 'That makes cloud in general fairly complicated', they agree. They cannot simply put their data in the cloud. ANWB also has a lot of legacy. 'In the current situation that effectively rules out the public option', says Du Burck.

    For both ANWB and Countus, the network and the infrastructure as a whole are subordinate to what they are allowed to do with user data. The same goes for the telemetry and analytics you can extract from the network through continuous monitoring. Verdiesen: 'We would very much like to pursue this company-wide, but we are not allowed to without additional GDPR measures that make it much more complex'. He does see that very valuable insights could be gained from it.

    That is not to say that analytics is off-limits by definition: 'One-on-one it is allowed, between a single client and Countus, but scaling up is not permitted without the consent of everyone involved', says Verdiesen. Until the legislation changes, they are stuck with this, he concludes.

    Even though Du Burck is prepared to state that 'ANWB will never go fully public', a hybrid model is certainly being considered. Think of providing workplaces for employees: Office 365, but also remote access to the ANWB environment, is something the organization is working on. After all, it has to keep up with developments around the modern workplace.

    Finally, Du Burck points out that a move to a public cloud requires many more things to be investigated, agreed upon, and changed before it can happen. Think, for example, of changes to governance, administration, policy, budgets, and so on.

    Proposition 2: Private means more control over privacy and compliance

    We discuss the second proposition with Erik van der Saag, ICT sector manager at the Tabijn school group, and with Duncan Megens, who, like Du Burck, works at ANWB, but as manager backend - IT operations. Here we have two radically different organizations at the table. According to Van der Saag, Tabijn is already 80% in the cloud, while at ANWB that figure is not even 10%.

    We quickly end up in an almost philosophical discussion about what we mean by control. Megens: 'What do you mean by control? I can interpret that in several ways. Is it about the theoretical control you have over your network, or about how well you actually have it under control? That is quite a difference, also for the business as a whole'.

    Take Office 365 as an example. Tabijn is in the process of making the switch, and at ANWB preparations for the transition are also underway. 'You definitely give up control when you move to Office 365', according to Van der Saag, something Megens wholeheartedly agrees with. Yet, given the cloud nature of the service, you also gain control in areas such as patching, depending on how you set things up.

    Bottom line, according to both, there is little point in talking about control without talking about how something is set up. Van der Saag gives the example of exporting pupil data to the cloud. Tabijn certainly does this, but there are conditions that must be met: 'Pupil data can no longer simply be exported. There has to be a layer in between that ensures the data is also secure'. In the end, Tabijn has just as much control over privacy and compliance in the cloud as it would have on-premises.

    The conclusion of this discussion is therefore that the proposition is not true by definition. If you set up the network and the infrastructure properly, it does not matter where your data resides or where your applications run. Ultimately it is also a matter of trust. There is often still a feeling that data is less safe outside the walls of your own environment, but that is not necessarily the case. This undoubtedly has to do with a generation gap as well, so over time the gut-feeling argument will be made less often.

    Proposition 3: Offloading your worries is something you do in the cloud

    For the third and final proposition we sit down with Daniel Treep, architect at KPN, and Martijn Jonker of Andarr Technology. KPN needs little introduction; Andarr is a company that, in its own words, is 'not for softies' and offers ICT consultancy and secondment services to organizations.

    At the table we fairly quickly agree that 'offloading worries' may not be the best-chosen term. If you make the transition from a private environment, most customers actually feel that complexity is being added, certainly where the network is concerned. You suddenly have to fit all sorts of different platforms into one network architecture. You can hardly say your worries have been taken off your hands; rather the opposite. You could also see this as a transition phase that you simply have to get through. At least, that is what we take from Jonker's remark that 'offering everything in the cloud is the ultimate dream in terms of maturity'.

    According to Treep, it matters quite a bit what you consume in the cloud. With SaaS you simply purchase a complete service which, if all is well, takes the worries off your hands. Jonker immediately adds a caveat, however, because very little is laid down about how services must be offered in the cloud. 'Ideally, a service is set up so that you cannot access it without a password, for example, but there is no obligation whatsoever to do so'. So the worries cannot go out the window entirely when purchasing SaaS, according to Jonker.

    In contrast to SaaS, Treep says that with PaaS and IaaS you have virtually the same worries as with other deployment models. Jonker agrees: 'a programmer can do whatever he wants with it and create an enormous data lake that you no longer have any overview of'.

    According to Treep, the bottom line is simple when it comes to having things taken off your hands in the cloud. 'How much control do you get over the platform? That is what it ultimately comes down to'. If you have a lot of control, you can set things up in such a way that you have few worries about them. Automation plays a clear role here: 'Automation is the foundation of any cloud approach whatsoever'.

    Automation and offloading worries are thus closely related, so in that sense you could say the proposition makes sense conceptually, even though both gentlemen add quite a few caveats.

    If you interpret 'offloading your worries' to mean that all your worries as an IT manager are over, you are in for a rude awakening, both gentlemen think. 'You trade your current worries for different worries in the cloud', is the clear conclusion.

    Conclusion: Different paces and end goals

    Discussions like the one described above, between managers of very different organizations, always provide a nice cross-section of the market. When it comes to the journey to the cloud, it is clear that not every organization moves at the same pace, but also that not every organization has, or should have, the same end goal.

    If you are an organization that thinks about change from the perspective of the infrastructure, and thus the network, you are much more inclined to be positive about the propositions. Put bluntly, you then assume that by configuring the various interfaces correctly it should be perfectly possible to create one large logical network in which you have control and can automate everything.

    If, as an organization, you mainly deal with personal data and your applications are far more important than your infrastructure and network, you will be less positive. That is only logical, because for you it is not a technical exercise. The GDPR, for example, is hardly about technology at all.

    This type of organization can certainly also make the move to the cloud, but it requires many more additional measures. For a school group like Tabijn that is somewhat more manageable than it is for ANWB, to name just one example.

    That said, in our view you can question whether every organization should want to pursue the 'ultimate dream of maturity' that Martijn Jonker of Andarr spoke about. In some cases it will remain a dream, or will always be seen as a nightmare. Let us also hope that the performances in the flight simulator are not, for some, a foreshadowing of how their transition to the cloud will turn out.

    Author: Sander Almekinders

    Source: CIO

  • What to expect in data management? 5 trends  

    What to expect in data management? 5 trends

    We all know the world is changing in profound ways. In the last few years, we’ve seen businesses, teams, and people all adapting — showing incredible resilience to keep moving forward despite the headwinds.  

    To shed some light on what to expect in 2022 and beyond, let’s look at five major trends with regard to data. We’ve been watching these particular data trends since before the pandemic and seen them gain steam across sectors in the post-pandemic world.  

    Trend 1: Accelerated move to the cloud(s) 

    We’ve seen a rush of movement to the cloud in recent years. Organizations are no longer evaluating whether or not cloud data management will help them; they’re evaluating how to do it. They are charting their way to the cloud via cloud data warehouses, cloud data lakes, and cloud data ecosystems.  

    What’s driving the move to the cloud(s)? 

    On-prem hardware comes with steep infrastructure costs: database administrators, data engineering costs, flow sweeps, and management of on-prem infrastructure itself. In a post-pandemic world, that’s all unnecessarily cumbersome. Organizations that move their databases and applications to the cloud reap significant benefits in cost optimization and productivity. 

    What to know about moving to the cloud(s) in 2022: 

    Note that I’m saying “cloud(s)” for a reason: the vast majority of organizations opt for multi-cloud and hybrid cloud solutions. Why? To avoid putting all their data eggs in one cloud basket.  

    While cloud data management services make it easy to move the data to their cloud, they also make it easiest to stay in their cloud — and sometimes downright hard to move data from it. Remember, a cloud vendor is typically aiming to achieve a closed system where you’ll use their products for all your cloud needs. But if you rely on a single provider in that way, a service change or price increase could catch you off-guard.  
     
    To stay flexible, many organizations are using best-fit capabilities of multiple cloud providers; for example, one cloud service for data science and another for applications. Integrating data across a multi-cloud or hybrid ecosystem like this helps organizations maintain the flexibility to manage their data independently.  

    Trend 2: Augmented or automated data management 

    Every organization relies on data — even those without an army of data engineers or data scientists. It’s very important for organizations of any size to be able to implement data management capabilities.  

    According to Gartner, “data integration (49%) and data preparation (37%) are among the top three technologies that organizations would like to automate by the end of 2022." 

    What’s driving the shift to augmented or automated data management? 

    Data management has traditionally taken a lot of manual effort. Data pipelines, especially hand-coded ones, can be brittle. They may break for all kinds of reasons: schema drifts when there are changes between source and target schema; applications that get turned off; databases that go out of sync; or network connectivity problems. Those failures can bring a business to a halt — not to mention that they are time-consuming and expensive to track down and fix.  
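
    To make the schema-drift failure mode concrete, here is a minimal sketch of the kind of guard an automated pipeline can apply before loading. It assumes pandas and a CSV extract; the table, column names, and types are hypothetical examples, not a specific product's API.

    ```python
    # Minimal guard against schema drift in a hand-coded pipeline:
    # compare the columns of an incoming extract with the schema the
    # target expects, and fail fast before the load step breaks downstream.
    import pandas as pd

    EXPECTED_SCHEMA = {            # hypothetical target schema
        "order_id": "int64",
        "customer_id": "int64",
        "order_total": "float64",
        "created_at": "object",
    }

    def check_schema(df: pd.DataFrame) -> list[str]:
        """Return a list of human-readable schema problems (empty list = OK)."""
        problems = []
        for col, dtype in EXPECTED_SCHEMA.items():
            if col not in df.columns:
                problems.append(f"missing column: {col}")
            elif str(df[col].dtype) != dtype:
                problems.append(f"{col}: expected {dtype}, got {df[col].dtype}")
        for col in df.columns:
            if col not in EXPECTED_SCHEMA:
                problems.append(f"unexpected new column: {col}")
        return problems

    source = pd.read_csv("daily_orders.csv")   # hypothetical source extract
    issues = check_schema(source)
    if issues:
        raise ValueError("Schema drift detected: " + "; ".join(issues))
    # ...safe to continue with the load step here...
    ```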

    Automating data management also frees up engineering resources. Gartner also says that by 2023, AI-enabled automation in data management and integration will reduce the need for IT specialists by 20%. 

    What to know about data management in 2022: 

    By tapping into data services, even small and under-resourced data teams can implement data management and integration — by automating pipelines, quality, and governance on demand. Automation supports flexible pipeline creation, management, and retirement, granting organizations of any size or stage of growth the data observability they need in a continuous integration, continuous deployment (CI/CD) environment. 

    Trend 3: Metadata management 

    Since metadata is the glue that holds necessary data management pieces together, it’s no wonder that organizations are aiming to improve their handle on it.  

    As different lines of business develop their own shadow IT, the ecosystem grows in complexity: many companies end up buying multiple solutions and tools and then often need to pay consultants to make them work together.  

    What’s driving interest in metadata management? 

    Business agility is a requirement in today’s chaotic business landscape, which creates enormous demand for analytics. Healthy data is now a must-have for users with varied levels of technical skill. It’s impossible to expect them to become data analysts and engineers overnight in order to find, share, clean, and use the data they need.  

    What to know about metadata management in 2022: 

    Many companies have multiple data integration tools, quality tools, databases, governance tools, and so on. As data ecosystems become increasingly complex, it’s more important than ever that all those tools can speak to each other. Applications must support bi-directional data exchange. According to Gartner, data fabric architecture is key to modernizing data management. It’s the secret sauce that allows people with different skill sets — like data experts in the business and highly skilled developers in IT — to work together to create data solutions. 

    Trend 4: Real-time data access  

    Real-time data is no longer a nice-to-have; it is vital to operations ranging from manufacturing to utilities to retail customer experience. In addition, every company needs operational intelligence.  

    Any time an event is created, you should be able to provide that event in real time to support real-time analytics. 

    What’s driving interest in real-time data access? 

    We haven’t just seen the arrival of the Internet of Things (IoT) and Industrial Internet of Things (IIoT) — businesses are now reliant on them. In a world fueled by real-time data, batch integration and bulk integration are no longer enough to keep up.  

    What to know about real-time data access in 2022: 

    Extract, Transform, Load (ETL) has to be supported by other integration styles, including streaming data integration to capture event streams from the logs, sensors, and events that power your business. Make sure you’re building an architecture that supports both batch and real-time streaming integration, as well as virtual data access techniques such as data replication and change data capture (a simple polling-based sketch follows below). That way you won’t have to move the data when you don’t want to.
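
    As an illustration of change data capture alongside batch ETL, here is a minimal polling-based sketch using only the Python standard library. The database, table, and column names are hypothetical; a production setup would more likely read the database's own change log or use a streaming platform rather than polling.

    ```python
    # A minimal polling-style change data capture (CDC) loop: instead of
    # re-extracting the whole table in a nightly batch, repeatedly pick up
    # only the rows that changed since the last run and emit them as events.
    import sqlite3
    import time

    conn = sqlite3.connect("source.db")        # hypothetical source database
    last_seen = "1970-01-01 00:00:00"          # high-water mark for changes

    def emit(row):
        print("change event:", row)            # stand-in for a stream producer

    while True:
        cursor = conn.execute(
            "SELECT id, status, updated_at FROM orders "
            "WHERE updated_at > ? ORDER BY updated_at",
            (last_seen,),
        )
        for row in cursor:
            emit(row)
            last_seen = row[2]                 # advance the high-water mark
        time.sleep(5)                          # poll interval in seconds
    ```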

    Trend 5: Line of business ownership of data 

    Data is no longer tightly controlled in the back end by a central IT or data organization. In more and more businesses, the organization reporting to a CDO or CIO focuses on governance and compliance while business users process data within their own lines of business.  

    What’s driving line of business ownership of data? 

    As data becomes the language of business, we’re seeing the proliferation of citizen data scientists, citizen data integrators, citizen engineers, citizen analysts, and more.  

    What to know about line of business ownership of data in 2022: 

    Low-code and no-code data preparation and self-service data integration tools equip data users on the front end to ingest, prepare, and model the data for their business needs. These new “citizen” data workers are business experts who don’t have a PhD in statistics or engineering. They don’t know R, Python, Scala, Java, C Sharp, or Spark — and they shouldn’t have to. On the other hand, decentralizing data management can create data governance, compliance, and security headaches.  

    As more and more data software sits with the line of business, organizations should look for a data fabric that will enable central data engineering teams to monitor what the data preparation teams prepare. That way, data experts can improve data governance and compliance while lines of business maintain ownership of the data itself.   

    Author: Jamie Fiorda

    Source: Talend

  • Which Technologies to Consider in Your Digital Transformation Strategy

    Which Technologies to Consider in Your Digital Transformation Strategy

    If your enterprise is about to undertake a digital transformation (Dx) project, you should understand that these initiatives require a focus on more than the technology itself. To succeed with a digital transformation strategy, the business must focus on business processes, day-to-day activities and tasks, and the culture within the organization, so that the environment is prepared to support all the changes you will make within the technology infrastructure. 

    Of course, Dx does require a focus on virtual technologies, computing environments, and data storage, such as computing, networks, hardware, software products, and applications, as well as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), and mobile applications used by team members, customers, and stakeholders. In short, today’s technology reach goes far beyond the walls of the enterprise and, when you consider how your team, customers, suppliers, and stakeholders use or connect to your technology, you know you must include all of these components in your Dx strategy. 

    Here are just a few of the components of the technology infrastructure you will need to consider in your digital transformation strategy:

    Public, Private, and Hybrid Cloud Platforms: Data resides in public, private, and hybrid cloud platforms and, as your workflow and business processes are assessed, you will need to include access to and security for these cloud environs, as well as any integration and streamlining your team (or your IT consulting partner) must include in the plan.

    Data Warehouses, Data Hubs, Data Lakes: Data integration can help you streamline activities and make information more accessible. Your data warehouses and repositories must be included in your assessment with appropriate user access controls and data migration and integration strategies. 

    Hardware, Servers, Network: Any and all of these components can impact the success of your digital transformation project. Your technology assessment must review these aspects of your infrastructure and determine whether you will require upgrades, streamlining, or expansion. 

    Software Applications and Products (Legacy, Best-of-Breed, ERP, etc.): When you undertake a digital transformation project, it is a good time to evaluate the software products and apps your team and stakeholders are using and determine whether any of these are ready for upgrade or replacement. Organizations change over time and a familiar software product or app may be popular with users, but perhaps one or more of these is no longer sufficient to meet your requirements. As part of your digital transformation strategy, you should take a hard look at the appropriateness of these tools and plan for replacement, upgrade, or changes. 

    Mobile Applications: Any digital transformation strategic initiative must accommodate mobile apps used by business users, suppliers, contract workers, or other stakeholders. Today’s mobile apps are important to workflow and business processes and must be included in user access, integration, compatibility, and security considerations. 

    IaaS, PaaS, and SaaS Platforms: Your business may have reduced its dependence on on-premises, licensed products and services, and any Dx initiative must accommodate these new environmental dependencies. 

    These are just a few of the technology considerations you will need to include in your digital transformation (Dx) strategy. Perform a thorough and complete assessment of tools, software, networks, and other components of your business technology that may be localized, regional, or limited to one business unit. Also, be sure your digital transformation (Dx) strategy encompasses other aspects of your business environment, such as enterprise culture. Don’t leave your users or stakeholders behind! 

    Date: July 5, 2023

    Author: Kartik Patel

    Source: Dataversity

  • Why cloud solutions are the way to go when dealing with global data management

    Why cloud solutions are the way to go when dealing with global data management

    To manage geographically distributed data at scale worldwide, global organizations are turning to cloud and hybrid deployments.

    Enterprises that operate worldwide typically need to manage data both on the local level and globally across all geographies. Local business units and subsidiaries must address region-specific data standards, national regulations, accounting standards, unique customer requirements, and market drivers. At the same time, corporate headquarters must share data broadly and maintain a complete view of performance for the whole multinational enterprise.

    Furthermore, in many multinational firms, data is the business: in worldwide e-commerce, travel services, logistics, and international finance, for example. So it behooves each company to have state-of-the-art data management to remain innovative and competitive. These same organizations must also govern data locally and globally to comply with numerous legislated regulations, privacy policies, security measures, and data standards. As a result, global businesses face a long list of new business and technical requirements for modern data management in multinational markets.

    For maximum business value, how do you manage and govern data that resides on multiple premises, clouds, applications, and data platforms (literally) worldwide? Global data management based on cloud and hybrid deployments is how.

    Defining global data management in the cloud

    The distinguishing characteristic of global data management is its ever-broadening scope, which has numerous drivers and consequences:

    Multiple physical premises, each with unique IT systems and data assets. Multinational firms consist of geographically dispersed departments, business units, and subsidiaries that may integrate data with clients and partners. All these entities and their applications generate and use data with varying degrees of data sharing.

    Multiple clouds and cloud-based tools or platforms. In recent years, organizations of all sizes have aggressively modernized and extended their IT portfolios of operational applications. Although on-premises applications will be with us into the foreseeable future, organizations increasingly prefer cloud-based applications, licensed and deployed on the software-as-a-service (SaaS) model. Similarly, when organizations develop their own applications (which is the preferred approach with data-driven use cases, such as data warehousing and analytics), the trend is away from on-premises computing platforms in favor of cloud-based ones from Amazon, Google, Microsoft, and others. Hybrid IT and data management environments result from the mix of systems and data that exist both on premises and in the cloud.

    Extremely diverse data with equally diverse management requirements. Data in global organizations is certainly big, but it is also diverse in terms of its schema, latencies, containers, and domains. The leading driver of data diversity is the arrival of new data sources, including SaaS applications, social media, the Internet of Things (IoT), and recently digitized business functions such as the online supply chain and marketing channels. On the one hand, data is diversifying. On the other hand, global organizations are also diversifying the use cases that demand large volumes of integrated and repurposed data, ranging from advanced analytics to real-time business management.

    Multiple platforms and tools to address diverse global data requirements. Given the diversity of data that global organizations manage, it is impossible to optimize one platform (or a short list of platforms) to meet all data requirements. Diverse data needs diverse data platforms. This is one reason global firms are leaders in adopting new computing platforms (clouds, on-premises clusters) and new data platforms (cloud DBMSs, Hadoop, NoSQL).

    The point of global data management in the cloud

    The right data is captured, stored, processed, and presented in the right way. An eclectic portfolio of data platforms and tools (managing extremely diverse data in support of diverse use cases) can lead to highly complex deployments where multiple platforms must interoperate at scale with high performance. Users embrace the complexity and succeed with it because the eclectic portfolio gives them numerous options for capturing, storing, processing, and presenting data in ways that a smaller and simpler portfolio cannot satisfy.

    Depend on the cloud to achieve the key goals of global data management. For example, global data can scale via unlimited cloud storage, which is a key data requirement for multinational firms and other very large organizations with terabyte- and petabyte-scale data assets. Similarly, clouds are known to assure high performance via elastic resource management; adopting a uniform cloud infrastructure worldwide can help create consistent performance for most users and applications across geographies. In addition, global organizations tell TDWI that they consider the cloud a 'neutral Switzerland' that sets proper expectations for shared data assets and open access. This, in turn, fosters the intraenterprise and interenterprise communication and collaboration that global organizations require for daily operations and innovation.

    Cloud has general benefits that contribute to global data management. Regardless of how global your organization is, it can benefit from the low administrative costs of a cloud platform due to the minimal system integration, capacity planning, and performance tweaking required of cloud deployments. Similarly, a cloud platform alleviates the need for capital spending, so up-front investments are not an impediment to entry. Furthermore, most public cloud providers have an established track record for security, data protection, and high availability as well as support for microservices and managed services.

    Strive to thrive, not merely survive. Let’s not forget the obvious. Where data exists, it must be managed properly in the context of specific business processes. In other words, global organizations have little choice but to step up to the scale, speed, diversity, complexity, and sophistication of global data management. Likewise, cloud is an obvious and viable platform for achieving these demanding goals. Even so, global data management should not be about merely surviving global data. It should also be about thriving as a global organization by leveraging global data for innovative use cases in analytics, operations, compliance, and communications across organizational boundaries.

    Author: Philip Russom

    Source: TDWI

  • Why You Should Be Securing Big Data In The Cloud?

    Combining big data and the cloud is the perfect solution for a company's computing needs. A company's data often requires a computing environment which can quickly and effectively grow and flex, automatically accommodating large amounts of data. The cloud computing environment does just that. There is one question which continually arises when discussing cloud computing.

    How secure is the cloud?

    Securing data, especially big data, is a major concern. Companies expect that any data stored in the cloud will be secured and that the security measures will be flexible enough to keep up with a changing threat environment. There are four ways to keep your big data secure in the cloud. Each will keep your data safe while still providing the flexibility that is inherent to using the cloud.

    1. Encrypt Your Sensitive Data

    Encrypting your data provides another level of security within your cloud infrastructure. Each security solution must be customized to the project and the data; there is no single type of encryption that will work for every situation. Certain types of on-premises gateway encryption solutions do not work well with cloud big data scenarios. Other solutions, including encryption provided by the cloud provider, ask end users to rely on someone else to encrypt their data. That is often a risky proposition, and most companies will not accept it.

    Encryption solutions such as split-key encryption were developed specifically for data storage in the cloud. These technologies keep the cloud data safe, providing encryption keys which the customer holds and uses.

    Split-key encryption is the safest and most effective means of encrypting cloud-based data.
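
    To illustrate the split-key idea, here is a minimal sketch in which the data key only exists when the customer's share and the provider's share are combined, so neither party can decrypt alone. This is an illustration of the concept, not a specific vendor's implementation, and it assumes the third-party cryptography package is installed.

    ```python
    # Sketch of split-key encryption: the real data key is never stored
    # whole; it is reconstructed in memory from two XOR shares.
    # Assumes: pip install cryptography
    import base64
    import secrets
    from cryptography.fernet import Fernet

    key = secrets.token_bytes(32)                 # the real data key
    customer_share = secrets.token_bytes(32)      # share held by the customer
    provider_share = bytes(a ^ b for a, b in zip(key, customer_share))  # share held in the cloud

    # At encryption/decryption time the two shares are recombined in memory.
    recombined = bytes(a ^ b for a, b in zip(customer_share, provider_share))
    assert recombined == key

    cipher = Fernet(base64.urlsafe_b64encode(recombined))
    token = cipher.encrypt(b"sensitive big data record")
    print(cipher.decrypt(token))
    ```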

    2. Use cloud security technologies which can be scaled to meet changing requirements.

    When it comes to big data, solutions must quickly scale to meet the demand. This is the same for security technologies for cloud data. Ensure any cloud security technology you choose is available and relevant across any and all cloud locations. Additionally, to be effective, any cloud security solution must be able to quickly scale to meet demands and changing requirements.

    Because of the inability to quickly scale and grow, hardware solutions are not a viable option for securing cloud big data. It is not possible to adapt a hardware security module (HSM) quickly enough to meet continuously changing data security requirements.

    Only a cloud-based solution will provide the ease and efficiency needed to scale quickly in response to demand. These solutions are just as effective as hardware-based technologies, if not more so. Additionally, cloud-based solutions such as CDNs provide security while also allowing sites and tools to run faster.

    3. Automate where possible

    Many companies are not happy with traditional cloud security solutions because they will not scale quickly to meet demand. Standard encryption technologies typically use an HSM element in their design. Since hardware cannot be automated, these security solutions are limited in their effectiveness within the cloud.

    The best cloud security solutions use virtual appliances instead of hardware within their systems. It is also important to ensure an effective RESTful API is part of any cloud security solution.

    A cloud security solution which includes a RESTful API and a virtual appliance will provide the automation and flexibility required to secure cloud big data.
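
    As a hedged illustration of why a RESTful API matters for automation, the sketch below scripts a key rotation against a security appliance over HTTP using only the Python standard library. The endpoint, payload, and token are hypothetical placeholders for whatever your security product actually exposes; consult its documentation for the real API.

    ```python
    # Sketch of driving a cloud security appliance through a (hypothetical)
    # REST API so that key management can be scripted rather than done by hand.
    import json
    import urllib.request

    API_BASE = "https://security-appliance.example.com/api/v1"   # hypothetical endpoint
    API_TOKEN = "replace-with-a-real-token"                      # placeholder credential

    def rotate_key(project_id: str) -> dict:
        """Ask the (hypothetical) appliance to rotate the encryption key for a project."""
        payload = json.dumps({"project": project_id, "action": "rotate"}).encode()
        request = urllib.request.Request(
            f"{API_BASE}/keys/rotate",
            data=payload,
            headers={
                "Authorization": f"Bearer {API_TOKEN}",
                "Content-Type": "application/json",
            },
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return json.load(response)

    if __name__ == "__main__":
        print(rotate_key("analytics-cluster-eu"))
    ```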

    4. Never compromise on data security

    Cloud big data security solutions are often a complicated business. As a result, we often see systems which are not quite as comprehensive as they should be. Some cloud security systems designers will take a shortcut to get around the complexities involved in securing big data.

    For example, some systems rely on freeware and its bundled encryption tools to secure the data. They may keep the encryption keys in a physical location or on a disc, which creates an opportunity for the keys to be lost or stolen. Shortcuts like these can certainly be easier, but they do not provide a viable security solution for cloud data storage.

    Companies must protect their data by mapping and reviewing its sensitivity and then designing a cloud security solution that matches. The right security solution can deliver great results. Remember, not all cloud data storage is equally secure. If you hold very sensitive or regulated data, you may need an additional security solution to keep it protected.

    Source: SmartDataCollective

  • Why you should implement automation in your business

    Why you should implement automation in your business

    When automation is done well, it accomplishes more than just saving time and money. It minimizes errors, improves productivity, increases employee satisfaction, and enhances the customer experience. When incorporated into a business strategy, employees get more done, in the same amount of time, allowing them to focus on the important objectives of their role.

    While automation may not be the latest advancement in technology, it will have the greatest impact on how we do business over the next decade. IT managers who fail to employ automation will likely lose their competitive advantage. Gartner estimates a 25-percent reduction in customer retention over the next year for companies that choose not to incorporate automation into their business strategy.

    What is automation and why do it?

    Automation enables the workflow to proceed without human oversight. Automation can be deployed in place of traditional manual systems such as entering purchase orders, customer service, data analysis, and reporting. Eventually, nearly all IT teams will automate some aspects of their businesses. As businesses grow, automation will expand customer service without increasing the number of employees. Successful automation enables your existing teams to manage additional customers with the same speed of service. Simply put, automation allows the company to accomplish more with less.

    Automation benefits both the company and its customers. Customers report an improved experience due to better consistency in order fulfilment, faster response times, and lowered costs. Improved customer experience will improve brand loyalty and increase customer lifetime value. Automation empowers companies to optimize the way they allocate internal resources to save money, and take advantage of new opportunities to increase sales. In other words, businesses are either saving or earning money when they automate.

    Reporting

    Automated reporting comes as part of the package with BI solutions. Users can access relevant and timely data on how the business is performing across all domains. By instantly converting raw data into actionable information, automated reporting eliminates the challenges associated with traditional forms of reporting. Now users can see what has happened, what is happening, and what is likely to happen in the future.

    Reports can be generated automatically at set times, such as every Monday morning for the weekly sales meeting. Reports may also be triggered by certain events, such as when sales figures fall within a certain range; both patterns are sketched below. Self-service analytics also provides users with a customizable dashboard for on-demand reporting based on job role. A dashboard allows users to see what is happening in real time and to drill down into the details to see the root cause of problems, as well as to identify new trends and opportunities. From sales teams to inventory managers, users have access to up-to-the-moment data from anywhere, on any device.
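
    The two report triggers mentioned above, time-based and event-based, can be reduced to a minimal sketch like the one below. The data, threshold, and schedule are hypothetical placeholders; a BI tool's own scheduler would handle this for you.

    ```python
    # Two common triggers for automated reports: a time-based trigger
    # (every Monday morning) and an event-based trigger (sales falling
    # below a threshold). Figures are hypothetical.
    import datetime

    SALES_ALERT_THRESHOLD = 10_000.00

    def generate_report(reason: str) -> None:
        print(f"[{datetime.datetime.now():%Y-%m-%d %H:%M}] report generated ({reason})")

    def maybe_run_weekly(now: datetime.datetime) -> None:
        # Monday == 0; run during the 08:00 hour
        if now.weekday() == 0 and now.hour == 8:
            generate_report("weekly sales meeting")

    def maybe_run_on_event(current_sales: float) -> None:
        if current_sales < SALES_ALERT_THRESHOLD:
            generate_report(f"sales dropped to {current_sales:.2f}")

    maybe_run_weekly(datetime.datetime.now())
    maybe_run_on_event(current_sales=8_250.00)
    ```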

    Security

    The cloud offers cost savings as well as added security benefits. IT managers and CTOs work with their SaaS providers to determine the level of access to be provided to users in their business. They can determine when devices should be able to access resources and restrict permissions based on users' job roles, as illustrated in the sketch below. Security should be a priority: in both private cloud and dedicated SaaS environments, it is important to manage and minimize data breaches as effectively as possible. To ensure the ongoing security of customer data, regular independent vulnerability and penetration testing and a security incident response policy are recommended.
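
    As a compact illustration of restricting permissions by job role, here is a minimal role-to-permission mapping. The roles and permissions are hypothetical examples, not any vendor's actual access model.

    ```python
    # Sketch of role-based access control: each job role maps to a set of
    # permissions, and every action is checked against that mapping.
    ROLE_PERMISSIONS = {
        "sales_rep":         {"view_dashboard", "view_own_accounts"},
        "inventory_manager": {"view_dashboard", "edit_stock_levels"},
        "it_admin":          {"view_dashboard", "manage_users", "configure_access"},
    }

    def can(role: str, permission: str) -> bool:
        """Return True if the given role includes the given permission."""
        return permission in ROLE_PERMISSIONS.get(role, set())

    print(can("sales_rep", "edit_stock_levels"))   # False
    print(can("it_admin", "manage_users"))         # True
    ```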

    When companies embrace automation, employees have time to work on items that add genuine value to the business, allowing them to be more innovative and increase levels of motivation. Customers also benefit from improved service and experience.

    Source: Phocas Software
