The security risks of building your own apps: A word of caution for hospital administrators

Hospital administrators and IT personnel who decide to create their own mobile applications – managing the process in-house – must recognize a number of security concerns.

These issues can mean the difference between enhanced protection of confidential information and significant data loss. The latter is often the result of errors or flaws, which hackers can exploit for financial gain or other motives.

Before addressing the proverbial doom and gloom associated with a major data breach or act of cyber crime, allow me to congratulate hospitals for taking the initiative in embracing mobile devices. I also respect the desire of hospital executives to build their own applications internally, as a way to ensure quality control and streamline the development process.

But hospitals are, first and foremost, centers of healing, treatment and medical research; they are not technology companies or software vendors whose primary focus is making smartphones and tablets more useful for a particular group of users.

Still, hospitals can create powerful, relevant, secure and easy-to-use mobile tools – provided they perform a series of analyses before and after they launch their respective applications.

To start, any application must undergo thorough testing (and retesting) before its release to doctors, nurses and staff. That testing consists of both dynamic and static analysis, encompassing three main categories.

Forensic analysis allows developers to determine if (and where) a mobile device stores sensitive material like usernames, passwords, credit card numbers, electronic health records, insurance files and patient-specific data.

That analysis can be reduced to this one critical question: Is the data on that device encrypted?

Also, how does the application store information in general?
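
To make that question concrete, here is a minimal sketch – assuming an Android app and the Jetpack Security (androidx.security-crypto) library – of the storage practice a forensic review hopes to find. The helper function, file name and key name are hypothetical.

```kotlin
import android.content.Context
import androidx.security.crypto.EncryptedSharedPreferences
import androidx.security.crypto.MasterKey

// Hypothetical helper: store a session token so it is encrypted at rest,
// rather than written to a plain-text preference file that a forensic
// examination of the device would expose.
fun storeSessionToken(context: Context, token: String) {
    val masterKey = MasterKey.Builder(context)
        .setKeyScheme(MasterKey.KeyScheme.AES256_GCM)
        .build()

    val securePrefs = EncryptedSharedPreferences.create(
        context,
        "secure_session_prefs", // file name is hypothetical
        masterKey,
        EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
        EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
    )

    // Both the key and the value are encrypted before they reach the file system.
    securePrefs.edit().putString("session_token", token).apply()
}
```

In forensic testing, a reviewer would then pull the app's data directory from a test device and confirm that no readable credentials or record identifiers appear in the files on disk.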

Next, there must be a network analysis involving the application. A team of specialists must know how the application interacts with a hospital's vast array of networks.

For example: Developers must find out if the application performs certificate validation and/or pinning. They must do their own audit to verify that this process is in place, while also establishing how the application sends sensitive information.

These criteria will enable developers to identify an application's vulnerability to "man-in-the-middle" or SSL proxy attacks. These same benchmarks will reveal whether session timeouts follow the right protocol.
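
As an illustration of the pinning check described above, here is a minimal sketch using the widely adopted OkHttp client on Android; the hostname and pin value are placeholders, not values from any real hospital system.

```kotlin
import okhttp3.CertificatePinner
import okhttp3.OkHttpClient

// The hostname and SHA-256 pin below are placeholders; a real deployment
// would pin the hash of its own server's public key and plan for rotation.
val certificatePinner = CertificatePinner.Builder()
    .add(
        "api.example-hospital.org",
        "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
    )
    .build()

val client: OkHttpClient = OkHttpClient.Builder()
    .certificatePinner(certificatePinner)
    .build()

// Requests made with this client fail if a proxy or attacker presents a
// certificate chain whose public keys do not match the pinned hash.
```

A network analysis would then route the app's traffic through an SSL proxy in a test environment and confirm that pinned connections are refused rather than silently accepted.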

Remember, these analyses exist for a reason; they are safeguards for an industry dependent on confidentiality, where even the slightest weakness can be an invitation for hackers and cybercriminals to pounce.

As the recent hacking scandal involving one of the largest health insurers in the United States demonstrates, cybercriminals can obtain – indeed, have already obtained – tens of millions of pieces of data about current and former customers and employees.

All of which brings us to the third and final point of analysis: code analysis, in which developers decompile an application and inspect memory dumps to determine whether sensitive information is stored within it.

As part of this procedure, developers must confirm whether an application permits proper data deletion, and how that application handles geolocation data.
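
As a sketch of the data-deletion point, the snippet below shows the kind of sign-out cleanup routine a code review would look for; the file prefix and preference file name are hypothetical.

```kotlin
import android.content.Context

// Hypothetical sign-out cleanup a code review would expect to find: cached
// record files and stored identifiers are removed rather than left on the
// device indefinitely. The file prefix and preference name are placeholders.
fun clearPatientData(context: Context) {
    context.filesDir.listFiles().orEmpty()
        .filter { it.name.startsWith("cached_record_") }
        .forEach { it.delete() }

    context.getSharedPreferences("app_session", Context.MODE_PRIVATE)
        .edit()
        .clear()
        .apply()
}
```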

The manner in which an application designates files (as "world readable/writeable," to cite just one scenario) can mean that anyone can read or modify those files – for good or ill.
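
To show what that looks like in practice, here is a minimal sketch – again for Android, with hypothetical file names – of writing files privately by default and flagging any that are readable or writable by other apps on the device.

```kotlin
import android.content.Context
import android.system.Os
import android.system.OsConstants
import java.io.File

// Write an export file so it stays private to the app; the world-readable
// and world-writeable modes were deprecated and later disallowed on Android
// precisely because they let any app on the device read or modify the file.
fun writePrivateReport(context: Context, contents: String) {
    context.openFileOutput("discharge_report.txt", Context.MODE_PRIVATE).use {
        it.write(contents.toByteArray())
    }
}

// A simple check a code review might automate: list files in the app's data
// directory whose permission bits grant read or write access to "others".
fun findWorldAccessibleFiles(context: Context): List<File> =
    context.filesDir.listFiles().orEmpty().filter { file ->
        val mode = Os.stat(file.absolutePath).st_mode
        (mode and (OsConstants.S_IROTH or OsConstants.S_IWOTH)) != 0
    }
```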

Resolving these challenges will result in a more secure application, a resource that enjoys the approval of developers and expert technicians.

Following these steps is straightforward, and doing so can yield important intelligence about how well an application performs and how well it can withstand attempts by hackers to extract priceless content.

With attention to detail – and constant testing – hospitals can build their own apps that are safe and effective.

Andrew Hoog is the CEO of NowSecure, which provides mobile security solutions, debunks common security assumptions and creates smarter technology to ensure private information remains private and not exposed to unnecessary risks.

