Showing posts with label Finance. Show all posts

Friday, November 11, 2016

Save to death or enabler for digitalization - PART I

Save to death or enabler for digitalization is the question every organization should ask itself when constructing the IT budget.



Without business there is no IT, but without IT you cannot do business anymore either. Some years ago a change happened: IT became an enabler, not only a cost as it was seen earlier. But why did we come to understand that?

Digitalization and getting the benefits out of existing or new information cannot be done without investment, and a lack of investment actually restricts the organization from getting value out of its information and data. Pragmatically, we can reduce our costs, optimize our work or create new business, but why is it so difficult to make decisions which reduce cost, improve our end users' performance and might create new value?

As an example, what has been the benefit of automating car manufacturing with robots? I would say huge, and the background to realizing those benefits has been the decision to invest: to spend money to build capabilities or reduce cost. Quite simple, and everyone can agree - hopefully. To get minerals you need tools to dig - invest and spend money to get something.

Are we still in the mindset where business leadership does not understand the benefit of the data (both saved and deleted) and of changing end user behavior - with prioritizing, of course? Data which has no positive value and is not deleted turns into a high risk of negative value through the EU GDPR requirements to prove and demonstrate how the data is used - or not used.


Someone smarter than me has said that success starts from our own employees: when they are happy, your customers are happy and share that in their feedback. It is easier to get more business with good feedback, which means higher turnover, which should mean better revenue and profit.

So we come back to the basic questions: how do we offer the best tools and the communication and collaboration setup which fit best based on the user's role and responsibilities, regardless of what device they use?
How do we make accurate information available to the users instead of making available a bunch of valueless data?

So where is the data stored? In applications with databases, network shares, local PCs and the cloud - both consumer and business versions.

An example of how data can be a risk, regardless of how it is seen:

  1. The end user uses Dropbox or OneDrive on his home PC, saving legal and illegal data to the PC
  2. The cloud service synchronizes it to the cloud, which is the reason to use those services
  3. The user configures OneDrive or Dropbox on the work PC, syncing the personal data from the cloud to on-premises
  4. This folder might then be copied with scripts to a corporate network share and backup systems, OR the folder is synced to OneDrive for Business in the corporate Office 365 tenant.
So we can clearly see the security risk, and the risk of storing personal data - both legal and illegal - in corporate storage the organization has no information about. From a GDPR point of view, IF something happens, the organization must go through these files to confirm there is no personal data belonging to the data subject who made the request or who is part of the security breach. All of this equals cost and is a risk which should be identified and discussed.
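This risk is hard to see without looking for it. As a minimal, hypothetical sketch (the folder names and the profile path are illustrative assumptions, not a complete detection method), an administrator could scan user profile directories for folders typically created by consumer sync clients:

```python
from pathlib import Path

# Illustrative folder names created by consumer sync clients. A bare
# "OneDrive" folder is treated as the consumer version here, since
# OneDrive for Business folders usually carry the organization name.
CONSUMER_SYNC_FOLDERS = {"Dropbox", "OneDrive", "Google Drive"}

def consumer_sync_dirs(profiles_root):
    """Return consumer-sync-looking folders found under each user profile."""
    hits = []
    for user_dir in Path(profiles_root).iterdir():
        if user_dir.is_dir():
            for name in CONSUMER_SYNC_FOLDERS:
                candidate = user_dir / name
                if candidate.is_dir():
                    hits.append(candidate)
    return hits
```

A real control would of course rely on endpoint management and policy rather than a folder scan; this only illustrates how cheaply a first inventory can be made.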

Users are smart, and they will try to find a way to work if the organization does not allow, for example, a way to share data with partners. The risk here is that corporate data will be synchronized via cloud services to their home PCs, which should not be allowed in any way.

And coming back to the basic question: what have we done to protect ourselves from this, what have we done to communicate to the end users what is allowed and what is not, and how do we ensure that we don't have data in the wrong place and don't keep valueless data in our storage? Normally a hardware upgrade pragmatically means copying the same unknown data to new storage every 5 years.

So what can be done to make the data visible and available, reduce the costs and automate data archiving and retention - and does it create any value for us? I would say absolutely yes. Legacy data can be made available from one common connection point like SharePoint even if the data is stored in Azure Blob Storage. This is one example of how to use Azure PaaS and storage in the background. Moving the data to the cloud should also change the network connectivity: why use a more expensive MPLS connection while the data is available from the public cloud?

What we have seen is a change in small offices, where the local file server and MPLS connection have been replaced by cloud storage and an xDSL connection with a site-to-site VPN over the internet to the corporate regional or main data center to allow access to internal business applications - which can actually also be published through Azure services and fit nicely as web services.

Everything is related to everything

I recommend you read Avanade's blog post by Wictor Wilen from Sweden, as it really works as an eye opener in many ways.

From Baby Boomers to Generation 2


It explains how different generations work and what they are familiar with, which actually opens a great opportunity for change management, service adoption and penetration.



The second topic it brings up is the idea of what to use and when, BUT one key here is the information, not the technology.


As we can see, Wictor has a wider view and perspective on what tools are used and where, linking it to generational preferences and priority/time sensitivity.

This is awesome and starts to work when deployed, BUT the issue is what happens to the old legacy solutions and data. Too often the transition project does not include funding to move the data from the legacy to the new solution to help the organization benefit from the investments. The end user experience cannot be seen as very great if the data they use daily is spread across file servers, SharePoint 2007, SharePoint 2010 and SharePoint Online. This correlates with saving to death, but the root cause is deeper. It's the mindset to save everything, and it has been the way of working through history, together with an open policy allowing users to save everything everywhere - while in parallel bemoaning that they cannot find the data, that it's not accurate, or that they cannot find the people who know about the topic they are looking for.

With monthly, subscription-based services, one key is user profiling, so one additional layer to add to the picture above is to profile the users: what services do they use now, what services will they use in the future, and then map those to the right monthly service subscriptions.

Secondly, we can also mirror the value of information from the individual to the organization, and there, understanding the usage of information means a lot. Different departments handle different types of information with their own requirements for saving the data. Finance has its own laws describing how long data must be kept, while R&D must save technical data under its own rules. There is mandatory data which can or cannot be deleted, but there is also a huge amount of draft and temporary data with no need to be saved - but still saved. This creates the snowball effect and the "databerg" (Veritas) - like an iceberg, only a small part of the data is known, and there is far more data the organization has no understanding of.
 
If you noticed, today's workplace also has bots helping users in their daily work, which means increased performance, which equals better productivity, which means better profit - simple, or not?

So what to do if saved to death? Get together and agree that now we need to use the money saved in earlier years - there is no free cheese, as the Rainmakers say.


So make the decisions, get the funding and start to work; there is no shortcut - unfortunately - except bankruptcy, if administrative fines are set while nothing has been done.

So everything is related to everything, but the greatest benefits also come from a bigger change across the technical, process and user experience views. Still, it is only work, and mostly the technology is there - some of it available out of the box in the latest Windows Server version, or through commercial applications like Veritas Data Insight and Enterprise Vault for archiving, with multiple storage options like SharePoint Online, Microsoft StorSimple with Azure, Azure Blob Storage, and local SAN and storage systems.

Weekly theme: Less data is a smaller risk

All thoughts are my own

The following picture shows what can be achieved with hard work, which also applies to this article.

American Car Show Helsinki, Finland 2016 "Make it shine"

Thursday, October 20, 2016

Remember what we learned from the year 2000 change --> Be smarter with the EU GDPR

From the year 2000 IT change to the next big change - the EU GDPR

How many of us remember the IT workload when we moved from the 1900s to the new and magnificent 2000s, with Windows 2000 Active Directory, changes in the applications and so forth - it was a great time for trainers. Running from customer to customer, teaching the Microsoft MOC 1560 NT 4 to Windows 2000 upgrade course and our own Windows Server 2000 courses.

That was the history part, and let's keep it there - but it brings to my mind, at least, the question of how we have prepared for May 25th, 2018, when the EU GDPR comes into effect.

The answers vary: some organizations do not even know about this, others have started to prepare, and the rest are somewhere in between - or are not doing anything.

Another question is the possibility of using the law the wrong way, where a criminal organization starts to recruit people, or asks people to go from one organization to another, ask what the organizations hold on them, and then say "please forget me".

Sounds like a DDoS (distributed denial-of-service) attack - all based on the law.

How many requests can one organization carry and dedicate resources to? How many resources, and how much FTE time, can they spend going through the requests and understanding whether there is any risk that yes, there might be some personal data on the user who made the request?

What if I pay $5 per user to make the query to any global service organization like eBay, Amazon, Microsoft or Netflix - you know what I mean. Let's assume someone invests $10,000 and gets 2,000 users to make the request to three companies - Amazon, eBay and Netflix - asking what data they hold on them. Based on the law, the companies need to analyze what they have on those users, and it might be nothing - those users may never even have created an account with any of those services - but the labour cost of doing the analysis is real. Then invest another $10,000 for the same users to register and create accounts with those services, and make the request to be forgotten after registration. Can the companies answer "we don't have any data on you, since we did the check earlier"? Based on the law - wrong answer. They need to do the analysis again, which equals labour cost and time away from more productive work, and so on. Then, after being attacked this way too many times, they answer the next user with "no way, go away - we are not going to analyze you and forget you" - but this time the request is to forget a dead relative.

This is only an illustrative example of how organizations need to prepare, and the question still stands - will they be ready on the 25th of May 2018, unless the law... who knows.


Nevertheless, this brings quite major topics to the table:
1. Do we know what data we have and where?
2. Do our applications support this - both applications built and published earlier, and applications in the development phase?

Let's take another illustrative example, using Facebook. Your husband or wife has been active on Facebook and suddenly he or she dies, and you have to fight to get the account and profile deleted (which opens another question about the groups he/she created and owns - what happens to those groups and the data inside them, which might mostly be personal data - but that is another story, let's get back to our example) and finally get confirmation of it. Then, suddenly, something happens at the Facebook service and they are forced to restore the data - what happens then?

I don't know - sorry.

But I can bet that for the relatives, it's not fun at all to see the wife or husband back and active.

This is just an example, based on no knowledge of how Facebook works to avoid this kind of situation, but again it brings the same question to my mind - are they ready?

So let's go back to the key questions, think through those two, and start from knowledge. Structured data is easier (or not) to understand and to know what we have, as it usually lives in a database and we know where the application and database are used - correct? The opposite, unstructured data, is or should be a big headache for the organization's business, risk management, security and IT. And this is not related only to data: it's not enough to analyze what you have in your collaboration tools and file shares; it also goes to identity. Now we are talking about privileged accounts running applications which might use file shares as part of small, legacy applications. Do you know those? When was the password last changed? Do you have any detection and control processes in place?
The words legacy and history carry huge weight here, where the data has been migrated between upgraded storage systems over the years without - usually - deleting anything.

It's payback time, unfortunately. In the same way, organizations using legacy Notes mail and applications have explained how they saved themselves into bankruptcy by staying on the same licenses and hardware too long - it was a good idea and the cheapest option in the short term, but there is no free cheese.
It cannot be expected that if we stay on the same version, others will too, or that in today's social world there is no influence on the brand when people share that they are using old tools and techniques in their daily work. Today's digital natives will vote with their feet, and we can bet that their social friends will know the reason by yesterday.

Let's get back again to the unknown unstructured data and what it is: data migrated over the years from old storage to new without deletion, together with growing data trends and changing user behavior. Traditional file shares do not - usually - have data classification, index and search capabilities, versioning, or mobile availability - you know and can name those - not unless you have purchased a 3rd party tool like Veritas Enterprise Vault for archiving: the kind of tool you probably tried to get rid of during the email migration to Exchange Online, as a good example.
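Since a traditional file share will not tell you this by itself, a first rough pass can be scripted. A minimal sketch, assuming last-modified time is a good enough proxy for usage (the five-year threshold is illustrative, not a retention rule):

```python
import time
from pathlib import Path

STALE_AFTER_DAYS = 5 * 365  # illustrative threshold, not a legal retention period

def stale_files(root, now=None):
    """Yield files under `root` not modified within the threshold."""
    now = time.time() if now is None else now
    cutoff = now - STALE_AFTER_DAYS * 86400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            yield path
```

A commercial tool adds classification, ownership and access analytics on top of this, but even a listing like the one above starts the discussion about what can be archived or deleted.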

So you find data of any kind, age, usage and amount, of which you might use and know about 10-20%; the rest is just storage cost - and now we are back in business: euros, dollars, pesetas, roubles, you name it. Money talks, and here is a small example.

Assumptions:
  • 24,000 users
  • 50 GB average disk quota per user
  • 20% active and valuable data
  • $3.5 / gigabyte managed storage cost (can range from $2-5 per gigabyte)
Calculation: 24,000 * 50 GB * $3.5 = $4,200,000.00 per year - not bad.

But let's calculate the size and price of the dark data - the data without any value (noting that even old data can be valuable, like old product drawings and contracts): 24,000 * 50 * 0.8 = 960,000 GB = $3,360,000.00 for nothing. To me that sounds like quite a good business case.
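The arithmetic above can be written out as a small script using the same illustrative assumptions:

```python
def storage_costs(users, quota_gb, cost_per_gb, active_share):
    """Split the yearly managed-storage bill into total and dark data."""
    total_gb = users * quota_gb
    total_cost = total_gb * cost_per_gb
    dark_gb = total_gb * (1 - active_share)   # the share nobody knows about
    dark_cost = dark_gb * cost_per_gb
    return total_gb, total_cost, dark_gb, dark_cost

total_gb, total_cost, dark_gb, dark_cost = storage_costs(24_000, 50, 3.5, 0.20)
print(f"Total: {total_gb:,.0f} GB -> ${total_cost:,.2f}/year")
print(f"Dark:  {dark_gb:,.0f} GB -> ${dark_cost:,.2f}/year")
```

The point of parameterizing it is that the same formula can be rerun with your own user count, quota and per-gigabyte price.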

As said, theoretically it is easy to show the business case, but it really requires more analysis, as the 80% of total storage usually includes installation media, backups, virtual machine disks, .ISO images, zip files and so on - and today, and even more in the future, movie and audio files - and of course an unknown amount of duplicates.
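That analysis can start with something as simple as grouping files by content hash to find exact duplicates; a minimal sketch:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by SHA-256 of their content and
    return only the groups with more than one file."""
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            # read_bytes() loads the whole file; on a real share you would
            # hash in chunks and pre-filter by file size first
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

This finds only byte-identical copies; near-duplicates (old draft versions, re-saved Office files) need content-aware tooling.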


So if we look back to the title and the year 2000, there are some commonalities:
  • Yes, it impacts the whole organization
  • Yes, it requires change
  • Yes, it requires financing, or you take the risk of penalties. I recommend reading it through with your risk organization together with business, legal, security and IT.
  • Yes, it includes your directory services
  • Yes, you need end user training and communication
If done correctly with the right partner, you might achieve benefits like:
  • Yes, you can sleep at night
  • Yes, this is the time and place to upgrade and adopt governance and start to monitor
  • Yes, you increase your security
  • Yes, you create or increase your data's value
  • Yes, the data in static file servers becomes available from any device, any time, anywhere (who remembers this from Microsoft - who said it, and when?)
  • Yes, you might reduce your storage cost
  • Yes, this is the time to create and adopt workflows and retention policies to start knowing the data and let automation take care of valueless data
  • Yes, your outsourcing contracts will save you
  • Yes and no - you might need to run a project to change the partner or hosting provider to sleep at night
  • Your data is available
  • Better user experience and work performance, as the data can be found
  • Yes, you stop the snowball effect where the situation creates an exception which creates an exception, which makes everything more complex, increases the security risk and consumes time - which equals money, frankly
  • and much, much more.

But that was today's story.

Today's picture brings the summer, sun and hot roads. Feel it.

"All comments, thoughts and pictures are my own, and I don't have a legal background"

Wednesday, October 12, 2016

GDPR III and beyond

Thinking is good but also painful in IT - or is it?

Should we start bottom-up or top-down when thinking of GDPR ==> information ==> data ==> and finally the storage where the data has been saved historically and will be saved in the future - hopefully with retention policies and archiving?

If we go back to the root and ask why we have storage, the answer should be clear: we want to do business, and without business there is no process to create data and demand for storage. This should be clear to all, but when we take some perspective and look from outside, IT might still be defining the storage architecture used for everything. That worked earlier, but today it's not so obvious anymore, and IT needs to discuss more with the business to understand its demands and how IT can bring new ideas and be an enabler rather than an ongoing cost.

What if we turn the idea upside down and start to think about what kinds of profiles we have in the organization, like:
  • Finance
  • HR
  • Sales
  • Communication
  • Training 
  • IT
  • R&D and product development
  • Procurement
  • ...departments...
- where each organizational silo, department or business unit uses its own applications and creates data in its required format, which can be totally separate from what other units manage. In parallel, the change from on-premises to public cloud and SaaS services has spread the corporation's data - no longer one data center with full control - to multiple locations, whether that happened because the required services were beyond the organization's own IT capabilities to offer and deliver, or because of a business decision.

Nevertheless, the data mass is growing, and its format is extending from traditional static files to audio and video files and formats. But who actually designs where these should be stored, how they should be available to the end users, to whom, and for how long they are valid? Sounds like it has something to do with governance, policies, metadata, blah blah blah - and still the question remains where to save this data and gain clear, measurable benefits from it...

I have asked myself multiple times how the GDPR and these topics map to storage, and the answer might be "not at all" and at the same time "everywhere", depending on the data content: does it include personal data, is it business data, is it valuable business data, is it a final version or a draft, is it searchable, and are there retention policies to delete information and data when there is no legal reason to keep it anymore - along with the question of what the value of the data is.

And like it or not, we come back to the basic questions of who owns the data and who creates the data/information. The answer is the business - for example, a salesperson managing opportunities in CRM and sending approved proposals to the customer based on an RFP. If we simplify even more and split the tasks into smaller and smaller parts to understand what kind of information is handled in an RFP response, we can quite easily find two types of data: structured data (customer information like addresses, contact persons, sales activities, calls and emails, campaigns...) managed in the CRM solution, and unstructured data, which is actually the result or deliverable - the proposal.

Sounds simple; it can be, or not, depending on the business and its size.

Let's open up the RFP response process. The salesperson creates a new opportunity in the CRM, maybe with workflows to get internal approvals to even start the work, staff the resources and create the bid team.

The bid team is like a small project, where the bid manager is responsible for the schedule and the deliverables, which are usually printed or electronic documents based on the RFP requirements. The work requires experts and SMEs able to create the solution, estimate its workloads and components, estimate the schedule, create the financials, and define what is in and out of scope and what the assumptions are. All of these usually need to be approved: by the business - yes, this is what we want to sell and what the customer is asking for in the RFP; by delivery - yes, we can deliver this in the presented time window with these resources; by finance - yes, all the financials like FX rates, the invoicing cycle and internal funding are aligned; and by legal - yes, from a legal point of view we are OK.

Simple - but thinking about this small project and the data it creates, it's not only the customer RFP answer file; it is a bunch of Excel sheets, drawings, technical documents, you name it. Now we can ask ourselves where we are going to store these files - and sorry, but even before that, how we will work and manage versions, how we share files, how each person knows what the current version is, what the additional materials are, and what happens if we need to restore something we already deleted. If we make this even more complex - the bid team's members do not all work in the same office - it will increase the internal cost to get people working in the same place, unless...

OMG, a question again. How well does our current CRM solution support collaboration and communication during the bid work?
  • Brilliant - all communication and collaboration features available from one application/service
  • Good - some minor issues, like a lack of IM, share, comment or review features, as an example
  • None - we can manage customers, upload documents and send preformatted emails from the client, but our CRM's role is customer relationship management and sales activities, not creating documents.

What was your answer?

In the same way, when you use an online web shop to buy a book, the system does not include the writer, the printing press or the forklifts moving the boxes; it includes only the customer data, the sales items and the orders ==> the end product itself - not drafts, not forklifts, not paper, not ink ==> the end product.

So, based on the earlier user profiles, we can identify data stored in two different locations based on the nature of the information:
  • Managed data - data in the CRM system (Microsoft Dynamics 365, Salesforce - there might be some others too :-)) like the customer name, address, contact persons, opportunities and their status, the value of the opportunity, signed proposals sent to the customer, and hopefully signed contracts too, with terms and conditions
  • Unmanaged data - data saved in SharePoint Online (are there other competitive solutions available with end-to-end integration to communication and analytics...) like Word, Excel, PowerPoint, Visio, AutoCAD and other files, with version history (major/minor), metadata and data classification, workflows and so on, not forgetting the search capability. "We have offered this kind of service or product before - let's find the cases and use copy and paste to reduce the time for the proposal, and in parallel benchmark the price." And all this with offline capabilities, automatic synchronization and sharing.
Based on the earlier points, we start to talk about the digital workplace and digital work, where the user can work from anywhere, use any device, approve the final version using a phone or tablet, and edit the same document version at the same time from different locations - all features not usually available in a CRM.

As said, each business unit has different demands, and when thinking about upgrading the infrastructure, one good step is to analyze each application and service: how they use storage, and what requirements they place on the infrastructure based on user demands. Summarize those to understand the big picture, and then, with innovation and digital in mind, start to find the solution - even though it might have a bigger impact than just moving data from old storage to new without any change.

As said, starting from the business view, moving to user profiles and understanding their daily work and the information they need or create is pragmatically quite valuable, and should drive the future roadmap to the digital workplace.

Still keeping in mind the lesson from my grandpa - the poor cannot afford to buy cheap - meaning you have to buy twice: first the cheap one, then the more expensive one.

To be Continued.....

"All opinions are my own"