Job Title: Senior Data Engineer
Company Overview: Prompt is revolutionizing healthcare by delivering highly automated and modern software to rehab therapy businesses, the teams within, and the patients they serve. As the fastest-growing company in the space and the new standard in healthcare technology, we're looking to bring on a Senior Data Engineer to take our Business Intelligence platform, internal/external data pipelines, data migration services, and healthcare software to the next level.
This role is an unprecedented opportunity to make an immense impact on Prompt's next chapter of growth. We are looking for someone who will be foundational to the growth of the product, team, and strategy at Prompt.
Why work for Prompt?
BIG Challenges: Here at Prompt, we are solving complex and unique problems that have plagued the healthcare industry since the dawn of time.
Talented People: Prompt didn't happen by chance; it's a team of incredibly talented and proven individuals who all made their mark before joining forces to build the greatest software on the planet for rehab therapists.
Healthy Approach: This isn't an investment bank; we work long hours when needed, but at Prompt you own your workload, and the entire organization values smart work over hard work.
Positive Impact: Prompt helps outpatient rehab organizations treat more patients and deliver better care with less environmental waste. That means less surgery and less narcotic-based pain treatment, all while turning a paper-heavy industry digital. We aren't enthralled with patting ourselves on the back every day, but it does feel good :)
The Role
Lead the design, development, and maintenance of data engineering solutions that transform clinical, operational, and financial data into actionable insights. This role combines technical expertise with healthcare domain knowledge to drive data-driven decision-making across the organization while ensuring compliance with healthcare regulations and data privacy requirements.
This role also requires learning new tools and technologies quickly, and you should have in-depth database knowledge as well as solid programming and scripting skills. You will help build efficient, stable data pipelines that can be easily maintained in the future. You should have expertise in the design, creation, management, and business use of large datasets.
Responsibilities
Data Architecture Design: Creating robust and scalable architectures to manage the flow and storage of data across the organization.
ETL/ELT Processes: Developing and managing extract-transform-load and extract-load-transform processes to ensure data is accurately integrated from various sources into data warehouses or lakes.
Data Pipeline Development: Constructing automated pipelines for data processing and transformation, ensuring smooth data flow and timely availability for analysis.
Database Management: Administering databases and ensuring their performance, integrity, and security.
Data Quality and Governance: Implementing data validation, cleansing, and governance practices to maintain high-quality and reliable data.
Collaboration: Working closely with AI engineers, BI engineers, analysts, and business stakeholders to understand data requirements and support their analytical tasks. Participate in code reviews and architecture discussions to exchange actionable feedback with peers. Contribute to engineering best practices and mentor junior team members. Help break down complex projects and requirements into sprints.
Performance Optimization: Monitoring and optimizing data systems for improved performance and efficiency.
Tool and Technology Integration: Evaluating and integrating new tools and technologies to enhance data management capabilities.
Skills
10+ years of experience working with cloud-based databases, data lakes, and data warehouses (e.g., S3, RDS, AWS Athena, AWS Redshift)
Proficiency in a data engineering tech stack, for example: Athena, Redshift, Glue, MySQL, Python, Spark, Kafka, SQL, AWS, Airflow, dbt, and containers and orchestration (Docker, Kubernetes), among others
Experience and understanding of distributed systems, data architecture design, and big data technologies
Experience with AWS technologies (e.g., AWS Lambda, Redshift, RDS, S3, DMS, Glue, Kinesis, SNS)
Knowledge of data quality management, data governance, and data security best practices
Organizational skills: time management and planning
Good knowledge of DevOps engineering using Continuous Integration/Delivery and infrastructure tools such as Jenkins, Terraform, and Kubernetes, with attention to automation, alerting, monitoring, security, and declarative infrastructure
Experience managing multiple data pipelines for internal and external data products
Familiarity with PHP (Laravel) and JavaScript frameworks is a plus
Perks - What you can expect:
Competitive salaries
Remote/hybrid environment
Potential equity compensation for outstanding performance
Flexible PTO
Company-wide sponsored lunches
Company paid disability and life insurance benefits
Company paid family and medical leave
Medical, dental, and vision insurance benefits
Discounted pet insurance
FSA/DCA and commuter benefits
401k
Prompt Therapy Solutions, Inc is an equal opportunity employer and does not discriminate on the basis of race, color, religion, ethnicity, ancestry, national origin, sex, gender, gender identity, sexual orientation, age, marital status, veteran status, disability, medical condition, or any other protected characteristic. We celebrate diversity and are committed to creating an inclusive environment for all employees.