Security, and cyber security in particular, sits at the forefront of priorities at most companies, with the focus primarily on preventing unauthorised exploitation of systems, networks, and technologies. There is a well-worn saying about security breaches – “It’s not if, but when…”.
Even though there are extremely sophisticated and comprehensive solutions on the market to help combat unauthorised exploitation of systems, networks and technologies, there are a few basic information management principles and practices that many customers overlook or neglect – practices that could make it much harder for external and internal bad actors to access, steal or hold data to ransom.
Information is held in data. It is the most precious asset of any company, and the majority of it consists of unstructured data residing across various platforms – this is where most of the information that needs protecting is kept. Most companies experience exponential data growth, putting tremendous strain on the systems hosting and protecting it, as well as on the people and processes responsible for managing it. Most unstructured data is considered “dark data” – the business does not know how valuable or sensitive the data is, who exactly has access to it, or how old or stale it is. In many instances even the exact location of the dark data is unknown.
It is hard to reduce risk you cannot see.
A least privilege data access model is exactly what the name suggests – provide users with access to only the data they need in order to perform their work. Most assessments performed against platforms hosting unstructured data, such as Windows file servers, SharePoint and OneDrive for Business, produce alarming results on how many files are unnecessarily exposed to and accessible by users through global access group membership. This type of “by default” access is unnecessary, dangerous, and extremely hard to audit across all the data repositories – and even harder to remediate without affecting end users and causing disruption to the business.
It must be stressed that insider threats are a real concern: either legitimate employees use their account credentials to access data they are not supposed to, or bad actors compromise or hijack those accounts to access data.
Global access typically goes hand in hand with broken Access Control Lists (ACLs), which prevent permissions from inheriting correctly down the directory structure. Using native tools to report on and repair broken ACLs, and to remediate open access, can consume hundreds if not thousands of hours. There is also no easy way to get instant reporting to determine the initial scope of the problem, or to track progress once repair and remediation activities are under way.
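To make the idea of auditing for over-exposed data concrete, here is a minimal, hypothetical sketch in Python. It uses POSIX permission bits as a crude stand-in for Windows ACLs (a real Windows audit would inspect ACL entries for groups like Everyone or Domain Users instead), and the function name is my own invention, not any product’s API:

```python
import os
import stat

def find_world_accessible(root):
    """Walk a directory tree and flag files readable or writable by everyone.

    A simplified stand-in for auditing "global access": here the check is
    the POSIX "other" permission bits, not Windows ACL entries.
    """
    exposed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            # S_IROTH / S_IWOTH: readable/writable by every user on the system
            if mode & (stat.S_IROTH | stat.S_IWOTH):
                exposed.append(path)
    return exposed
```

Even a toy scan like this makes the scale problem obvious: the walk touches every file, so running it across millions of files on many repositories is exactly the kind of work that consumes the hours mentioned above.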
Wouldn’t it be useful to also know, for example, who has accessed the data via the global groups in the last 90 days?
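As a toy illustration of that kind of query – the log format here is entirely hypothetical, and commercial platforms collect and index this telemetry automatically – the question reduces to filtering an access audit log by time window and group membership:

```python
from datetime import datetime, timedelta

def recent_global_access(events, group_members, days=90, now=None):
    """Return the users who accessed data in the last `days` days and who
    hold access via membership of a global access group.

    `events` is a list of (timestamp, user, path) tuples from a
    hypothetical access audit log; `group_members` is the set of accounts
    in the global group being investigated.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=days)
    return {user for ts, user, _path in events
            if ts >= cutoff and user in group_members}
```

Users who appear in the result genuinely used the access; members of the group who never appear are candidates for safe removal – which is the essence of remediating global access without disrupting the business.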
Luckily, there are products that specialise in providing answers – from near real time access audit log reporting to automating these types of repairs and remediations without affecting end users at any stage of the process. The Varonis Data Security Platform is one such product; it provides these features across all platforms hosting data and performs “auto healing” of unnecessary data exposure by way of its sophisticated machine learning.
Remediating global access on all data platforms is a big first step towards securing data and complying with a least privilege data access model. It is also an essential step in limiting the blast radius of an exploit.
But where to start? Wouldn’t it be great if the risk could be associated with the sensitivity of the data?
Due to the nature, size, and location of data in most organisations, the value of, and the risk associated with, the majority of data is unknown to the business.
Is all Human Resources information confined to the HR folder at D:\DATA\HR on the file server? Is all patient information stored exclusively in the patient information folder on the SharePoint site? Do users have scanned files containing credit card information saved in their OneDrive folders?
Classification of data is the essential next step: it provides visibility into the data, pinpoints sensitive information, and focuses security remediation efforts.
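To show what classification means in practice, here is a deliberately simplified sketch of one common pattern-matching rule: detecting payment card numbers. Real classification engines use large libraries of patterns, dictionaries, and machine learning; this example pairs a regular expression with the standard Luhn checksum to cut down false positives on arbitrary digit runs:

```python
import re

# 13-16 digits, optionally separated by spaces or hyphens
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number):
    """Luhn checksum - weeds out random digit runs that merely look like cards."""
    digits = [int(d) for d in number][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:           # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def classify_text(text):
    """Return 'SENSITIVE' if the text appears to contain a payment card number."""
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            return "SENSITIVE"
    return "PUBLIC"
```

The label name (`SENSITIVE`) and the single-rule design are illustrative only, but the principle is exactly the one above: once a file can be tagged this way, remediation can be prioritised by what the file actually contains rather than by where it happens to live.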
But is classification enough?
Various platforms that perform classification hold this information internally, leaving end users and other applications and systems unaware of it. Applying sensitivity labels to the data according to its classification alters the metadata of the files, providing visibility to users and to other applications or systems. Data Loss Prevention and protection can then be incorporated to provide more sophisticated and targeted security. More focus on this in my next session.
Last line of defence!
Since the start of digital data creation, Backup and Recovery has been one of the main pillars of the Information Management Lifecycle of data, providing for the ultimate protection against data loss. It has always been an insurance policy to protect against faulty equipment (disk or tape), environmental disasters (power, floods etc.) or errors and mistakes made by users.
Backups are not technically a security measure, but they are the essential recovery mechanism for securing data: when a breach has occurred, they provide restoration with minimal downtime (Recovery Time Objective) and without the loss of large amounts of data (Recovery Point Objective).
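The RPO concept reduces to simple arithmetic: the data you lose is the time between your last good backup and the incident. A back-of-the-envelope sketch (the function names and any schedule are hypothetical, purely to illustrate the definition):

```python
from datetime import datetime, timedelta

def data_loss_window(last_good_backup, incident_time):
    """Time span of data lost if recovery restores the last good backup."""
    return incident_time - last_good_backup

def meets_rpo(last_good_backup, incident_time, rpo):
    """True if restoring the last good backup stays within the agreed RPO."""
    return data_loss_window(last_good_backup, incident_time) <= rpo
```

For example, with nightly backups at 02:00 and an incident at 10:00, eight hours of data is gone – fine against a 24-hour RPO, a breach of a 4-hour one. This is also why failed backup jobs matter so much: every missed night silently doubles the loss window.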
Unfortunately, Backup and Recovery is no longer treated with the “respect” it deserves, for various reasons: backups are considered boring; backup solutions have complex, outdated interfaces that resemble programs from yesteryear; and backup responsibilities are often passed to junior IT staff with little experience, training, or interest.
The quality and redundancy of hardware has greatly increased, reducing the need for regular or comprehensive system restores, while clever hardware/software snapshot features give end users a “self-help” option for recovering individual files on demand. These are but a few examples of the factors that have led many companies to neglect regular recovery testing.
To complicate and dilute backup and recovery even further, many organisations mistakenly treat the backup of data as the same thing as archiving it. The result is that data keeps being added to the backup platform – more and more data is backed up, and retention periods are extended to unnecessarily long timeframes to meet the misplaced “archiving period”. Eventually the backup solution is overwhelmed, which inevitably leads to low backup success rates and gaps in recovery options. Declining budgets and shifting priorities also make it harder to keep backup and recovery solutions up to date with the latest best practice architecture and product features, including hardened backup repositories, swapping tape for deduplication technologies, and keeping secondary copies offsite and air gapped.
All these factors have given organisations a false sense of security about the state of their backups and how effective and complete recovery would be when called upon in an emergency. People tend to forget that if all the fancy security solutions fail and the data is compromised or held to ransom, backups are the last resort for recovery. Ransomware is an imminent threat to all organisations, and the bad news is that attackers have recently shifted their focus to target backup systems first, before targeting the data itself.
Visibility of data accessibility and sensitivity gives organisations the information required to remediate overexposure, comply with a least privilege data access model, and provide better security and backup protection for valuable information. Talk to Intalock and let us provide the expertise to help safeguard your company’s most valuable asset before the inevitable happens – it is not if, but when!