Providence, one of the US's largest not-for-profit healthcare systems, is committed to high-quality, compassionate healthcare for all. Driven by the belief that health is a human right and the vision 'Health for a better world', Providence and its 121,000 caregivers strive to provide everyone access to affordable, quality care and services.
Providence has a network of 51 hospitals, 1,000+ care clinics, senior services, supportive housing, and other health and educational services in the US.
Providence India is bringing to fruition the transformational shift of the healthcare ecosystem to Health 2.0. The India center focuses on healthcare technology and innovation, and plays a vital role in driving the digital transformation of health systems for improved patient outcomes and experiences, greater caregiver efficiency, and running the business of Providence at scale.
Why Us?
- Competitive pay
- Supportive reporting relationships

JD - Principal Data Engineer

How is this team contributing to the vision of Providence?
We, at Enterprise Services, the healthcare consulting and services arm of Providence India, help build technology solutions that modernize and simplify each step of the healthcare delivery process. And we do that by putting the patient and the provider at the centre of everything we do. Using the most promising and practical ideas, combined with the experience and expertise of people from the healthcare industry, we are creating experiences that work for care facilities and their patients, and that move us ahead on our mission of "Health for a better world".
What will you be responsible for?
- Develop effective, high-quality healthcare program integrity analytics that meet business requirements.
- Lead the Data Engineering strategy and delivery across global projects and products.
- Ensure the Data Engineering function works as one team, promote excellence and technical development, and adhere to key principles and processes.
- Guide, and take responsibility for, strategic decisions with the support of a team of senior leads.
- Communicate with cross-functional teams and gather the technical information required to align on developing a Unified Data Portal.

What would your day look like?
- Collaborate with DevOps teams to implement CI/CD pipelines, automated deployments, and infrastructure-as-code (IaC) practices for AWS-based solutions.
- Document design, development, and deployment processes, and create technical specifications and user guides for developed solutions.
- Collaborate with data architects and business stakeholders to understand data requirements and create scalable solutions.
- Use a systematic approach to plan, create, and maintain data architectures while keeping them aligned with business requirements.
- Prepare and deliver results to leadership with analytic insights, interpretations, and recommendations.
- Analyze and organize raw data.
- Build data systems and pipelines.
- Evaluate business needs and objectives.
- Interpret trends and patterns.
- Conduct complex data analysis and report on results.
- Prepare data for prescriptive and predictive modelling.
- Build algorithms and prototypes.
- Combine raw information from different sources.
- Explore ways to enhance data quality and reliability.
- Identify opportunities for data acquisition.
- Develop analytical tools and programs.
- Collaborate with data scientists and architects on several projects.
- Utilize healthcare data to meet administrative needs and goals.
- Understand data storage and data sharing methods.
- Understand healthcare business operations.
- Utilize different data sources for analyses.

Who are we looking for?
- 6-10 years of professional work experience, preferably in management consulting or high-growth start-ups, ideally in healthcare.
- 3+ years of experience in a data analytical role.
- Experience building a unified data system for a data portal.
- Bachelor's degree in mathematics, statistics, healthcare administration, or a related field.
- Experience with Hive, SQL, Databricks, Snowflake, and Alteryx.
- Familiarity with data modelling, data warehousing, and data integration concepts.
- Ability to design and implement ETL pipelines that extract data from various sources, transform it, and load it into a data warehouse or data lake.
- Ability to automate data workflows and establish integrations with a wide range of enterprise data systems such as SQL Server, Oracle, MySQL, and Snowflake.
- Ability to design, develop, and maintain data pipelines for the Unified Portal across multiple upstream databases (e.g., Azure Data Factory, Databricks, AWS Glue).
- Experience building connectors and extracting information through APIs using data extraction libraries.
- Familiarity with agile development methodologies and experience working in Agile teams.
- Familiarity with big data technologies such as Hadoop, Spark, and Kafka.
- Expertise with visualization tools (e.g., Power BI, Excel).
- Knowledge of data management applications.
- Analytical mindset with strong problem-solving skills.
- Excellent written and verbal communication skills.