Databases Need Continuous Monitoring & Proper Data Stewardship

While perimeter, cloud and mobile security tend to grab the headlines, in reality it's databases, and the private financial information stored in them, that are the actual targets of most breaches. Comprehensive database security is a commonly overlooked area within financial services organizations, yet one of the most critical.

Databases pose a unique security challenge for banks and financial institutions of all sizes. The database infrastructure at financial services companies is usually quite extensive, with many databases remaining unknown, unmonitored, unmanaged and, worse, unsecured. Many financial services organizations have only limited visibility into their database infrastructure, providing an open avenue for cyberattackers. Once inside the database infrastructure, an attacker can operate strategically and remain undetected for many months, stealing records, compromising credentials and installing malware.

In fact, according to KPMG's 2016 Banking Outlook Survey, published earlier this year, approximately 47% of banking EVPs and managing directors, as well as 72% of SVPs, reported they have no insight into whether their institution's security has been compromised by a cyberattack over the past two years. These numbers are alarming and point to a critical need for securing and monitoring databases. Any attack that reaches the core network can put a financial institution's databases and private information at extreme risk.

With breaches increasing at an alarming rate, it's important for financial organizations to follow thorough data stewardship practices and continuously monitor all of their databases, from initial deployment, throughout their lifecycle, to retirement and decommissioning. Monitoring needs to be detailed down to the table level to fully understand each database's security profile, data ownership, the purpose of the data and any changes to the data stores. Without an in-depth understanding of every database and detailed knowledge of the private data residing in databases throughout the network, it is impossible to keep data secure and prevent a serious breach. IT security personnel need to put the proper tools, policies and procedures in place.
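
One way to make table-level stewardship concrete is a metadata record per table. The Python sketch below is purely illustrative, not a prescribed schema; the field names are assumptions, but they capture the ownership, purpose, sensitivity and retention attributes discussed above:

```python
# Illustrative sketch: a per-table stewardship record. The fields are
# assumptions for the example; the point is that ownership, purpose,
# sensitivity and retention are tracked per table, not just per database.
from dataclasses import dataclass
from datetime import date

@dataclass
class TableStewardshipRecord:
    database: str            # which database instance the table lives in
    table: str               # table name
    owner: str               # accountable business/data owner
    purpose: str             # documented business purpose of the data
    sensitivity: str         # e.g. "public", "internal", "PII", "financial"
    retention_expires: date  # when the data must be reviewed or purged

record = TableStewardshipRecord(
    database="core-banking-prod",   # hypothetical instance name
    table="customer_accounts",
    owner="retail-banking-team",
    purpose="account servicing",
    sensitivity="financial",
    retention_expires=date(2026, 12, 31),
)
```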

The process starts with a comprehensive assessment of the database infrastructure. It is recommended to use non-intrusive monitoring tools to identify every database on the network and every application or user accessing them. Further, each database's business purpose needs to be documented, the nature and sensitivity of its data determined, and proper retention policies established. It is also important to know what will be done with each database when its retention period has expired. Zombie databases that should have been decommissioned long ago are an open opportunity for attack: they may not be properly patched, their credentials may not have been updated, and no one is actively monitoring their activity.
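
As a simple illustration of the discovery step, the sketch below probes a subnet for well-known default database ports using plain TCP connects. The subnet and port map are assumptions for the example; a production-grade tool would typically also use passive traffic analysis to stay fully non-intrusive:

```python
# Minimal sketch of database discovery: a TCP connect probe against
# well-known default database ports across a subnet.
import ipaddress
import socket

# Common default listener ports for popular database engines.
DB_PORTS = {
    1433: "Microsoft SQL Server",
    1521: "Oracle",
    3306: "MySQL/MariaDB",
    5432: "PostgreSQL",
    27017: "MongoDB",
}

def discover_databases(subnet: str, timeout: float = 0.5) -> list[tuple[str, int, str]]:
    """Return (host, port, engine) for every reachable database listener."""
    found = []
    for host in ipaddress.ip_network(subnet).hosts():
        for port, engine in DB_PORTS.items():
            try:
                with socket.create_connection((str(host), port), timeout=timeout):
                    found.append((str(host), port, engine))
            except OSError:
                continue  # port closed, filtered, or host unreachable
    return found

if __name__ == "__main__":
    # Example subnet; replace with the ranges under assessment.
    for host, port, engine in discover_databases("10.0.0.0/28"):
        print(f"{host}:{port} looks like {engine}")
```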

Once policies are established and all databases have been verified, financial organizations should continuously monitor those databases throughout their lifecycle to ensure policies and procedures are updated and effectively enforced. The key to stopping serious data breaches is paying close attention to who is using or accessing each database and how it is being used, and identifying key changes in usage patterns. An unknown user or an uncommon usage pattern may be a sign that a malicious attacker is on the network.
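
As a minimal sketch of usage-pattern monitoring, the example below compares current access activity against a baseline window and flags unknown users and volume spikes. The log representation and the spike threshold are assumptions for illustration:

```python
# Illustrative baseline-versus-current access monitoring: flag users never
# seen during the baseline window, and users whose query volume jumps well
# beyond their historical norm.
from collections import Counter

def find_anomalies(baseline: list[str], current: list[str],
                   spike_factor: float = 3.0) -> list[str]:
    """baseline/current are lists of user names, one entry per query."""
    base_counts = Counter(baseline)
    cur_counts = Counter(current)
    alerts = []
    for user, count in cur_counts.items():
        if user not in base_counts:
            alerts.append(f"unknown user: {user}")
        elif count > spike_factor * base_counts[user]:
            alerts.append(f"usage spike for {user}: {count} vs baseline {base_counts[user]}")
    return alerts

print(find_anomalies(
    baseline=["alice"] * 40 + ["bob"] * 10,
    current=["alice"] * 45 + ["bob"] * 55 + ["mallory"] * 3,
))
# -> ['usage spike for bob: 55 vs baseline 10', 'unknown user: mallory']
```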

Zombie databases are particularly vulnerable to insider threats, advanced persistent threats and compromised credentials. Attackers can use them as an open door to reach other databases, and potentially private financial information, across the network.

Rogue databases present a similarly large, high-risk attack surface. These one-off databases may have been commissioned during the development of a new application and connected to the network without the IT team being aware of their existence. Developers may believe they are doing something innocuous, but without IT taking the database through the proper lifecycle steps, its data won't be properly protected. Private data on these rogue databases resides outside the scope of the security team, leaving the organization highly vulnerable. Without intelligent monitoring to identify when a new database becomes active on the network and to check it against the current data asset inventory, it is not possible to properly secure its data.
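
Building on the discovery sketch above, a minimal inventory check might flag any live database listener that is absent from the registered asset inventory. The inventory format here is an assumption for illustration:

```python
# Sketch of an inventory check that pairs with the discovery scan: any
# live database listener not present in the registered asset inventory
# is reported as a potential rogue database.
REGISTERED = {            # (host, port) pairs from the data asset inventory
    ("10.0.0.5", 5432),
    ("10.0.0.8", 1433),
}

def find_rogues(discovered: list[tuple[str, int, str]]) -> list[tuple[str, int, str]]:
    return [(h, p, e) for h, p, e in discovered if (h, p) not in REGISTERED]

# Feeding in discovery results flags, e.g., an undocumented MySQL instance:
for host, port, engine in find_rogues([
    ("10.0.0.5", 5432, "PostgreSQL"),    # registered
    ("10.0.0.9", 3306, "MySQL/MariaDB")  # not in inventory -> rogue
]):
    print(f"rogue database: {engine} at {host}:{port}")
```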

With so much attention focused on securing the perimeter, mobile devices and the cloud, financial services IT teams risk ignoring the security of their organizations' crown jewels: all of the databases residing on their network. To prevent a serious data breach, every database needs to be identified, inventoried, continuously monitored and retired when no longer in use. To protect sensitive information, IT teams must know who is accessing each database and what it is used for, and must ensure its data is protected for the database's entire lifetime. Without a comprehensive database monitoring model in place, financial institutions run the risk of a serious breach and of becoming front-page news.

For more information on database monitoring, contact Quant ICT Group: www.quant-ict.nl, glenda@quant-ict.nl, tel. +31 880882500

Source: blog by Steve Hunt, Credit Union Times