As a follow-on to the recent post on best practices and recommendations to promote data sharing in government from a technology perspective, this post covers the same topic from a policy point of view. The intention of the Policy Panel was to discuss common issues, barriers, and challenges government organizations experience with regard to policy and organizational culture surrounding data sharing. The panelists were also encouraged to discuss successes in policy change and policy implementation, their opinions on how policy should be developed and written, and what should be considered when altering policy to improve data access and sharing.
1. Characteristics of a good policy
Many aspects go into what makes a good policy: the focus and goals of the policy, the clarity and legibility of the documentation, and who has been tasked to write and preside over the policy’s development. The focus of the policy should be on the outcome, not the output. If data is viewed as a mission asset, the policy will focus on the aspects that matter to the outcome the organization’s mission requires. Yet the policy should not be written or influenced by those who execute the missions themselves. Those who implement the mission outputs could write the policy to benefit themselves, creating a solution that does not serve the true mission of the organization; this is known as “self-dealing.” If the data itself is treated as the most important aspect of the system, costs that would otherwise seem “unforeseeable” become visible, and additional burdens that may arise, such as data loss or distortion, can be prevented. Policy also needs to take care of data by following the data lifecycle, not the organizational structure. This means the data must undergo continuous maintenance, with attention to preservation and to preventing loss or distortion, until the data lifecycle is complete. This focus keeps data management thoughtful and current in terms of storage, cataloguing, documentation, and discoverability. The policy should be revisited, and changed when necessary, to stay aligned with the mission and to enable the capabilities that support analytics.
Policy also needs to be driven from the top down and standardized across the organization. When a policy is enacted, it needs to be disseminated properly and taught consistently throughout the organization. If it is not, the result is confusion about what the policy stands for and how it should be implemented at every level, and the policy never becomes a standard: “A standard isn’t a standard because it’s mandated, it’s a standard because it is used” (John Eberhardt quoting the CEO of HL7). For an organization to clearly communicate a policy, the policy first needs to be physically written down; too often, policies are dictated by rumor and by employees’ overly cautious practice. Furthermore, the policy needs to be commonly understood and written in plain, legible language, not in a legal format with complex logic. The policy should also not be overly prescriptive and should cover only the minimum necessary. In the past, policies have been written with a heavy focus on detailing every contingency that might occur, leaving them inflexible and inconsistent with new technologies and practices such as big data, cloud computing, and DevOps. Finally, new policies should always respect privacy, civil rights, and civil liberties.
2. When developing policy, data should be viewed as an asset
Begin looking at data as an asset for use, not from the IT perspective. Traditional data policy looks toward data architecture, technology, and database management as opposed to focusing on the use of the data. Separating the data from the system when considering data policy helps create more efficient, effective, and better-preserved data assets. Continuously repurchasing data, reestablishing data architecture, and rebuilding databases can be avoided if the policy focuses on managing the data resources already in place. Bringing in data SMEs who know which data matters, alongside IT specialists who provide the tools, helps identify common problems and eliminate redundancies and inconsistencies in already-established enterprise systems. Cost is reduced when data is refined into a renewable resource and when enterprise systems are flattened by reusing existing systems and eliminating unnecessary components from the overall process. Furthermore, data policy in the form of cross-domain sharing and promoting access can be driven by the mindset of data as a strategic asset. The data becomes more easily discoverable and accessible when good standardization practices are followed and the data is translated into a common language.
3. Policy used as a force for organizational and cultural change to encourage and facilitate data sharing
Exposing and demonstrating the benefits of data sharing is the best practice for using policy as a force for organizational and cultural change. Putting in place data policy that mandates accessibility to data, or that enables sharing data between organizations, breeds new analytic capabilities and increases mission effectiveness. This type of data policy change needs to be driven from the top down. When policy is implemented from the top down, it changes the organization’s cultural mindset as it performs mission-essential tasks through the analysis of newly accessible data. Regrettably, it is usually those who create policies who slow data sharing policy change. Policy leaders are hesitant to allow access to “their” data and are not meeting with other leaders to establish data policies that allow seamless cross-domain sharing. In an effort to change this organizational and cultural mindset, President Barack Obama issued an executive order, “Making Open and Machine Readable the New Default for Government Information,” on May 9, 2013. Following this, the “Open Data Policy Memorandum,” or M-13-13, was signed, providing guidelines on how to make the executive order feasible. The memorandum states that all data must by default be open and machine readable; all government agencies must maintain a single portal for their data; every agency must create and maintain a data inventory; and CIOs gain greater oversight of data quality. This policy will drive the change toward open and shareable data, but the culture will still have trouble transitioning. Once the data policy is in place and enforced from the top down, the cultural and organizational mindset will eventually change as well.
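To make the "open and machine readable by default" requirement concrete, the sketch below builds a minimal catalog entry in the spirit of the data.json inventory files agencies publish under M-13-13. The field names loosely follow the Project Open Data metadata schema; the dataset, agency, and URL are hypothetical placeholders, not real records.

```python
import json

# A minimal, hypothetical data-inventory entry in the spirit of the
# Project Open Data metadata schema used for agency data.json files.
entry = {
    "title": "Permit Applications 2016",          # human-readable name
    "description": "Monthly permit application counts by district.",
    "identifier": "example-agency-permits-2016",  # unique within the agency
    "accessLevel": "public",                      # open by default
    "modified": "2016-12-01",
    "publisher": {"name": "Example Agency"},
    "distribution": [
        {
            "mediaType": "text/csv",              # machine-readable format
            "downloadURL": "https://data.example.gov/permits-2016.csv",
        }
    ],
}

# Serialize the inventory the way it would appear in a data.json catalog,
# so other agencies (and the public) can discover the dataset mechanically.
catalog = {"dataset": [entry]}
print(json.dumps(catalog, indent=2))
```

The point of the structure is discoverability: because every entry carries a stable identifier, an access level, and a machine-readable distribution format, a single agency portal can be harvested automatically rather than browsed by hand.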
4. How data costs can become extraordinary in government
Data can become expensive without proper resource management, preservation, or cataloguing. Organizations often must repurchase their data because they never registered the data product. Organizations pay for rebuilding databases, migrating data, new tools to access data, applications where data is handled, and so on. If an organization does not focus its policy on data as a strategic asset, it may not understand the complexity of its system. This leaves room for inconsistencies and redundancies as the same data is handled by secondary and tertiary applications, driving cost beyond anticipation. Flattening architecture and focusing on data standardization, consistency, reliability, and maintaining the right pedigree will reduce cost in later years. Utilizing SCOPs to perform these actions makes data easily accessible and in a common format, removing the need to rebuild the enterprise architecture and databases when transitioning to a data sharing environment. Furthermore, failing to preserve data during its lifecycle may result in data distortion or data loss; if this occurs, the loss is incalculable, because the data’s value can never again be evaluated or used to its potential. Data policies should be in place to prevent the loss and distortion of data. For organizations utilizing the data, focusing on data as a strategic asset and performing resource management can make the data much more valuable. Proper metadata documentation, eliminating redundancies, reducing inconsistencies, and translating to a common language make data sharing, access, and discovery much cheaper for the organization.
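The repurchasing problem described above is, at bottom, a registration problem: if every acquired data product is recorded in an inventory keyed by a stable identifier, a duplicate purchase can be caught before money is spent. A minimal sketch of that idea follows; the registry structure, identifiers, and product names are hypothetical, not any particular agency's system.

```python
# Minimal sketch of a data-product registry that flags duplicate
# acquisitions before an organization pays for the same data twice.
# Identifiers, titles, and sources here are hypothetical examples.

registry = {}  # identifier -> product metadata


def register(identifier, title, source):
    """Record a data product; refuse to re-register a known identifier."""
    if identifier in registry:
        existing = registry[identifier]
        return (f"Already held: '{existing['title']}' "
                f"from {existing['source']}")
    registry[identifier] = {"title": title, "source": source}
    return f"Registered: {title}"


# First acquisition is recorded; the second attempt, even from a
# different vendor, is flagged instead of repurchased.
print(register("census-tracts-2015", "Census Tracts 2015", "vendor-a"))
print(register("census-tracts-2015", "Census Tracts 2015", "vendor-b"))
```

The design choice that matters is the stable identifier: deduplication works only if every acquisition is keyed the same way, which is exactly the standardization and common-language practice the panel recommends.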
I hope you will find this helpful in better understanding the challenges and opportunities in government data sharing, and perhaps even use it to be more effective and efficient in moving these types of initiatives forward.
About the Author: Craig Parisot is the CEO of Advanced Technology Applications (ATA, LLC) of McLean, Virginia focused on full stack data science engineering providing strategy, infrastructure, analytic and security solutions in multiple industry sectors serving commercial and government clients. Craig is also the Organizer of the Full Stack Data Science Meet-up in collaboration with Data Community DC and an angel and growth stage technology investor.
Perspective on Policy: Promoting Data Sharing in Government