Lead Data Architect
No matter what your job title, the work you do at Kaiser Permanente supports the health and well-being of our members. All 8.7 million of them. That’s because each of us—from our financial professionals and IT team members to our RNs and physicians on the front line of care—shares a commitment to providing the best possible care experience. With locations across the United States, we offer the opportunity to build a rewarding career in an environment that supports your success. Join us and put your beliefs into practice.
Kaiser Permanente is currently seeking a hands-on Data Architect to support its Compliance, Legal and Regulatory Affairs (CLRA) business partners. The Data Architect will work with large healthcare datasets and will translate clients' business requirements into specific systems, applications, or process designs for large, complex IT solutions while integrating IT and business architectures. Participation in and contribution to industry-, KP-, and KPIT-wide initiatives is also expected in the areas of data usage and governance, information management, privacy and security, SOA, data analytics and visualization, information modeling, and big data.
This individual will be part of a dynamic team that builds and implements data analytics solutions that serve as key decision support systems for the leadership of the company. Key to success in this role is a unique blend of technical skills, business knowledge, and the ability to work with and influence multiple cross-functional teams within IT and the business to arrive at the right solutions.
Roles & Responsibilities
Strategy & Planning
- Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
- Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Create end-to-end vision on how logical design translates into one or more physical databases, and how the data will flow through the successive stages.
- Establish methods and procedures for tracking data quality, completeness, redundancy, compliance and improvement.
- Assess and determine governance, stewardship, and frameworks for managing data across the enterprise.
- Create strategies and plans for data capacity planning, data security, life cycle data management, scalability, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are in regulatory compliance.
- Recognize and identify potential areas where existing policies and procedures require change, or where new ones are needed.
Project Related Deliverables and Tasks
- Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
- Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
- Collaborate with project leads and consultants and business unit leaders for all projects involving enterprise data and analytics.
- Create and maintain data model and metadata policies and procedures for functional design.
- Provide technical recommendations and engage with ETL/BI architects, business SMEs, and other stakeholders throughout the solution/data architecture and implementation lifecycle, recommending effective approaches for developing high-performance, highly scalable data solutions (data marts/warehouses, data mining, and advanced analytics).
- Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
- Develop and implement key components as needed to create testing criteria in order to guarantee the fidelity and performance of data architecture.
- Document the project and program data architecture and environment in order to maintain a current and accurate view of the larger data picture: an environment that supports a single version of the truth and is scalable to support future analytical needs.
- Identify and develop opportunities for data reuse, migration, or retirement and platform upgrades.
- Communicate with the customer and project team in a timely manner and escalate issues and risks appropriately.
Qualifications
- BS or MS in Computer Science/Business Information Systems/Engineering
- 10 or more years of IT experience
- Expertise in data architecture, data strategy, and roadmap development for large, complex organizations and systems; has implemented large-scale, end-to-end data management and analytics solutions for more than one large client
- Expertise with normalized OLTP, MDM, and DW dimensional modeling techniques: star and snowflake schemas, slowly changing dimensions, role-playing dimensions, dimensional hierarchies, and data classification
- Expertise in Data Quality, Data Profiling, Data Governance, Data Security, Metadata Management, MDM, Data Archival and Data Migration strategies using appropriate tools
- Hands-on experience with related/complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef)
- Hands-on experience with ETL (Extract-Transform-Load) tools (e.g. Informatica, Talend, Pentaho)
- Hands-on experience with BI tools and reporting software (e.g. MicroStrategy, Cognos, Business Objects)
- Hands-on experience with analytical tools, languages, or libraries (e.g. SAS, SPSS, R, Mahout)
- Experience with high-scale or distributed RDBMS (Teradata, Netezza, Greenplum, Aster Data, Vertica)
- Ability to drive delivery in a matrixed environment, working with various internal IT partners
- Demonstrated ability to work in a fast-paced and changing environment with short deadlines, interruptions, and multiple tasks/projects occurring simultaneously
- Must be able to work independently and have skills in planning, strategy, estimation, and scheduling
- Strong problem-solving, influencing, communication, and presentation skills; a self-starter
Nice to Have
- Hands-on experience with the Hadoop stack (e.g. MapReduce, Sqoop, Pig, Hive, Hbase, Flume)
- Experience overseeing the "productionalizing" of Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning)
- Knowledge of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems
- Knowledge of NoSQL platforms (e.g. key-value stores, graph databases, RDF triple stores)
Greenwood Village, CO 80111