Saturday, October 15, 2016

GDPR Part IV - Align with your digital journey

So,

we have storage, we have data and we have the users, but how do GDPR and digital link together?

Easily. The answer is quite simple and pragmatic: in digitalization, users get tools based on their role in the organization, as explained earlier. A salesperson uses CRM, an R&D persona uses PLM, and so on.

So a future system should support a couple of things from a wide angle (just believe, there is much more..).

1. Automate the user process. Make the work easier with tools which support the work, not the system. What automation gives you is that something is always done the same way, and it can be monitored and explained afterwards what has happened. A workflow is a great example. When the user tags a file in the metadata as a draft version or as final, there can be two different workflows: a rule which automatically deletes the draft file after 6 months, and a second rule which requires an approval before moving the document to a secure data repository or asset store. The idea is that the document is searchable and available to others, which actually explains a couple of key things.
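The two rules above can be sketched as one small decision function. This is a minimal illustration, not any specific product's workflow engine; the status values and the 180-day window are assumptions taken from the example in the text.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 180  # assumed 6-month retention for drafts


def workflow_action(status: str, tagged_on: datetime, approved: bool,
                    now: datetime) -> str:
    """Decide what the workflow does with a document based on its metadata tag."""
    if status == "draft":
        # Rule 1: drafts are deleted automatically after the retention window.
        if now - tagged_on > timedelta(days=RETENTION_DAYS):
            return "delete"
        return "keep"
    if status == "final":
        # Rule 2: finals move to the secure repository only after approval.
        if approved:
            return "move-to-repository"
        return "await-approval"
    return "keep"
```

Because the decision is pure metadata in, action out, every outcome can be logged and explained later, which is exactly the auditability point made above.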

2. Metadata and classification. Legacy storage and file servers do not support these at the sophisticated and required level, and usually a file server is just a file server: lots of data, weird folder structures, maybe some security-related issues, data in the wrong place and so on. You name it.

In 201x, what is the demand for file servers in central data centers or branch offices while everyone is speaking of cloud and access from anywhere? And when comparing an on-premise solution to a certified cloud solution or service, a good question is: are we compliant, and how will our data center services be analyzed against laws and regulations? Are we able to show a reliable audit trail of what has happened, who has done what and when, and, unfortunately, how long ago it actually happened?

3. Workflow - Like metadata and classification, one valuable thing is that we automate simple things with workflows or software robots, which makes this interesting. When we try to define the workflow, we move closer to the process and to how people work, which is awesome.

How we work can only be understood by doing it or by interviewing the people who know it. But there is also another side: resistance to change and staying in the as-is. That is where we need people who, through interviews, understand the as-is, but who use their knowledge of emerging technologies and digitalization to bring the value and the ideas: how this manual work can be automated, what kind of value it creates, and how the benefits can be achieved and realized - without talking about products at all.

4. Process - as described earlier, one key thing is to know how we work today, and also to see how we should and will work and behave in the future.


5. Audit - this is a nice Pandora's box which has multiple layers and views, and here we open just a small part of it, in no particular order.

5.1 User identity, starting from the process of how users are created and who manages the personal data, covering everything from the identity point of view: create, modify, delete, communicate... Not to forget the privileged accounts and group membership management.
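The lifecycle operations listed above only become auditable if every one of them leaves a record. A minimal sketch, with field names of my own choosing (a real identity system would write to a tamper-evident store, not a Python list):

```python
from datetime import datetime, timezone

AUDIT_LOG = []  # stand-in for a tamper-evident audit store


def identity_event(actor: str, action: str, target: str,
                   privileged: bool = False) -> dict:
    """Record who did what to which identity, and when.

    Covers create/modify/delete as well as group membership changes;
    privileged operations are flagged so they can be reviewed separately.
    """
    entry = {
        "when": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,        # e.g. "create", "modify", "delete", "add-to-group"
        "target": target,
        "privileged": privileged,
    }
    AUDIT_LOG.append(entry)
    return entry


identity_event("hr-system", "create", "jack")
identity_event("admin.jane", "add-to-group", "jack", privileged=True)
```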

5.2 Authentication / login and the audit trail, or detect and control. Regardless of whether the data is on-premise or in the cloud, one question is how you can detect abnormal behavior. From the user profile or personalization view we know that Jack works from 7 to 16, sitting in the office with a desktop on the table. This is the place where he sits and tries to sell products to customers in a CRM solution that includes personal data. What if Jack is logging in from the other side of the world, or Jasmine is trying to log on from Jack's PC at 03:45?
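The Jack and Jasmine scenario can be expressed as a simple rule check against a known profile. This is only an illustration of the idea; the profile fields and values are assumptions from the example, and a real detection system would score far more signals than these three.

```python
from datetime import datetime

# Illustrative per-user baseline: Jack works 07-16 from the office desktop.
PROFILES = {
    "jack": {"hours": range(7, 16), "location": "FI", "device": "JACK-PC"},
}


def is_suspicious(user: str, login_time: datetime,
                  location: str, device: str) -> bool:
    """Flag a login that deviates from the user's known working pattern."""
    profile = PROFILES.get(user)
    if profile is None:
        return True  # no baseline for this user: always flag
    if login_time.hour not in profile["hours"]:
        return True  # e.g. a login at 03:45
    if location != profile["location"]:
        return True  # e.g. the other side of the world
    if device != profile["device"]:
        return True
    return False
```

The point is not the rules themselves but that the organization has a baseline to compare against, so that the 03:45 login from Jack's PC stands out at all.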

What kind of governance, process, logging, communication and action capabilities does the organization have?

A brilliant example is the Finnish Police in the news, where they internally detected police officers reading data about the deceased skier Mika Myllylä years ago. Those officers were not involved in that case in any way, and did not even work or live in the same municipality (not fully sure about this), so their daily work gave no reason to look at this case information. Now they were in the news when the court handed down its decisions with quite light penalties. The news did not mention how this illegal data breach was detected, but the key here is that a control was in place. We can always ask whether those persons can work as police officers any more, because they might also look at data about their neighbors and misuse the power given by their work.

As said - from the audit trail point of view, the organization should have good logging systems with proactive capabilities, and governance on how to proceed when abnormal behavior is detected.

But - again - this also creates yet another data repository of personal data when we start to think about this end to end.

5.3 Logs and how to handle them. I have earlier mentioned the proxy without authentication, and the key here is: do we understand whether there is personal data in any logs, from the client through the network to the application / business logic / database layers, and in identity and authentication? If the answer is yes, then we need to think a little more about how to work with this. Can we remove all logs? And if yes, did we lose data and break the end-to-end audit trail? So we need a solution for how we archive logs and ensure that only authorized persons have access to them, and also a technical solution protecting against, for example, log deletion used to hide other criminal actions.
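One common middle ground between "delete all logs" and "keep personal data forever" is pseudonymizing the identifiers before archiving. A minimal sketch: the `user=<name>` log format and the salted-hash scheme are my assumptions for the example, and the salt would in practice be a secret held only by authorized staff.

```python
import hashlib
import re

SALT = "example-salt"  # assumption: secret kept by authorized staff only


def pseudonymize(line: str) -> str:
    """Replace user identifiers in a log line with a salted hash token.

    The same user always maps to the same token, so the end-to-end
    audit trail stays linkable without storing the name in plain text.
    """
    def repl(match: "re.Match") -> str:
        token = hashlib.sha256((SALT + match.group(1)).encode()).hexdigest()[:12]
        return f"user={token}"

    return re.sub(r"user=(\w+)", repl, line)
```

Reversing a token back to a person then requires the salt, which is itself an access-controlled operation - matching the requirement above that only authorized persons reach the archived logs.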

5.4 Data - this is wider, because due to integration and connections it is more difficult to say how apps will connect and how they handle data across applications. One sure thing is that, together with proactive authentication monitoring, we must also monitor users' usage of data, regardless of whether the data is stored in a managed or unmanaged way.
So we need to log all access, both failed and successful, and what I have observed is that this is not at an acceptable level. It has also been discussed why monitoring only the failures is not enough: it does not capture the case where technical rights allow the access but the user should not use that data source based on his role in the organization. A good example is a normal IT person having access to financial or POS systems with his basic work account.
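The IT-account-in-the-financial-system case above is exactly what logging only failures misses: the login succeeds, so nothing is flagged. A small sketch of logging every attempt and separately flagging role mismatches (the role-to-system mapping is invented for the example):

```python
# Illustrative mapping of organizational roles to the systems they should use.
ROLE_ALLOWED = {
    "sales": {"CRM"},
    "it": {"ticketing", "monitoring"},
}


def check_access(user_role: str, system: str, technically_allowed: bool) -> dict:
    """Log an access attempt, successful or failed, and flag role violations.

    Even when technical rights grant the access, the attempt is flagged
    if the system falls outside the user's role.
    """
    role_ok = system in ROLE_ALLOWED.get(user_role, set())
    return {
        "system": system,
        "outcome": "success" if technically_allowed else "failed",
        # The interesting case: access worked, but the role says it shouldn't.
        "role_violation": technically_allowed and not role_ok,
    }
```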

So logging is required to see whether a security breach has happened, but is that enough? No, unless you have a process and a person to look through all the logs. There are also tools and services which can be used, which actually use machine learning to recognize abnormal behavior.

6. Network - The network is changing, but that does not reduce its criticality. Instead, more of the traffic will flow from the client to SaaS services over the Internet instead of over the internal MPLS network. This turns the whole network mindset around: from protecting our internal network, where all the services sit, to how and where we connect, and also how to reduce the cost. The key question here is how and from where the users connect, and where their services are hosted. The more we go to the cloud, the more we need to think about name services and how to ensure the fastest connection for the client. As said, geoDNS will have a huge impact on the overall DNS architecture, where your client has had a DNS IP address pointing to the Active Directory DNS, which has then forwarded the DNS query to the corporate public name server in one location. Meaning that a user in Ottawa will use the public DNS in Helsinki, and all the responses given back to the client will point to Europe. An example of this is Office 365, where in this case the client gets the IP address of a Microsoft service in Dublin and connects through the internal MPLS network to an IBO in Europe. There is nothing wrong with that, but if the client in Ottawa could use geoDNS, get a local Microsoft service IP address and have a local IBO, the client would connect to the nearest Microsoft front-end Exchange, and the data from Dublin would then be managed by Microsoft, which releases all the email traffic from MPLS to the Internet.
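The core of the geoDNS idea is simply that the answer to the same DNS query depends on where the client asks from. A toy sketch of that lookup, with made-up regions and documentation-range IP addresses (nothing here reflects Microsoft's actual infrastructure):

```python
# Illustrative geoDNS answer table: client region -> nearest front-end IP.
# 203.0.113.0/24 is a documentation range; the values are invented.
GEO_ANSWERS = {
    "EU": "203.0.113.10",  # e.g. a Dublin front end
    "NA": "203.0.113.20",  # e.g. a North American front end
}
DEFAULT_ANSWER = "203.0.113.10"  # single central answer, the legacy behavior


def resolve(client_region: str) -> str:
    """Return the front-end address nearest to the client's region,
    instead of the one fixed answer a central corporate DNS would give."""
    return GEO_ANSWERS.get(client_region, DEFAULT_ANSWER)
```

With the legacy setup every client effectively gets `DEFAULT_ANSWER`; with per-region answers the Ottawa client lands on a local front end and a local IBO instead of hairpinning through Europe.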

Another example is VPN usage. When services move more and more to the cloud and spread all over the world, tight security and VPN usage start to be an issue. Encrypted traffic gets routed through the VPN to an on-premise gateway, which can mean that a user sitting in a hotel in Dublin must open a VPN tunnel to a US concentrator, which then routes the traffic back over MPLS to Europe and an IBO in Helsinki, or from the US to the Internet - not cool at all.

So moving from legacy to SaaS and digitalizing the work might, and usually does, have an impact on the network, routing and network services too, and makes the transition even more complex and unfortunately more costly. Moving from 3 regional IBOs to country-based IBOs with geoDNS is not a one-day activity; instead it requires hard design and implementation work, where the organization's own capabilities and its service providers' capabilities are the key.

But time to go, and all good so far.

To Be Continued...

"All opinion, thoughts and pictures are my own"
Car Show Lahti/Finland October 2016

