Data Tech Lead
Google Cloud, BigQuery, GCS, Airbyte, SQL, dbt, Python, Flask, FastAPI, Pandas, JavaScript (React), GraphQL, Terraform, GitHub Actions, Sigma Computing, OpenMetadata, Monte Carlo, OpenAI, LLMs
About the role
You will help define the technical roadmap and build the data platform for finance data processing. We are looking for someone who thrives in a fast-paced team, cares about good engineering, and wants to build a strong product. As a Data Tech Lead, you will design scalable data pipelines, ensure data quality, and improve performance. You will work with teams across the company to translate business needs into technical solutions and help mentor junior team members. Clear communication and knowledge sharing are important in this role. We want someone proactive who can take on significant responsibility, drive innovation, and solve complex data problems that help the business grow.
About the project
The client, a leading venture capital firm, faces delays and data reliability issues due to siloed systems across departments. To solve this, a unified data platform will be developed, integrating Salesforce and other key systems. This will streamline data management and enable accurate, accessible analysis through advanced BI tools on web and mobile.
Who are we looking for?
Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 10+ years of experience in data engineering roles.
- A minimum of 3 years of demonstrated proficiency in Python, SQL, and the data systems development life cycle; full-stack development experience is preferred.
- Advanced hands-on experience with a range of databases (for example, PostgreSQL, BigQuery, Redis), including query optimization techniques.
- Proficiency with cloud platforms, preferably GCP, for building scalable solutions.
- Proficiency in database management and SQL, plus knowledge of modern data warehousing tools (such as Snowflake, Databricks, Redshift).
- Proven track record in data integration, ETL pipeline development, and data transformation workflows.
- Strong problem-solving skills and attention to detail, with the ability to tackle complex data challenges, address business problems through data-driven solutions, and translate high-level business needs into effective data architectures.
- Experience with unit and integration testing methodologies.
- Hands-on experience with Docker and a good understanding of CI/CD practices.
- Strong knowledge of data governance frameworks and compliance requirements.
- Strategic thinking and experience developing data strategies.
- Expertise in emerging technologies and awareness of industry trends.
- Proven track record of collaboration with product managers and fellow engineers, particularly in distributed, multicultural teams.
- An excellent command of the English language, both written and verbal.
Nice to have
- Knowledge of the financial domain and understanding of wealth management, investment concepts.
- Experience with GraphQL and its integration with data platforms.
- Experience with Palantir Foundry, including developing and optimizing complex data pipelines with Pipeline Builder and creating data models with the Ontology.
- Contributions to open-source projects or personal projects showcasing data engineering skills.
Responsibilities
- Involvement across the full project lifecycle, from requirements gathering and system analysis to design, development, testing, and deployment.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Design, implement, and maintain scalable and reliable data pipelines, data warehouses, and data lakes.
- Establish policies for data governance, including quality assurance, lineage tracking, and access control.
- Mentor and guide engineers, fostering a culture of continuous learning and improvement.
- Provide guidance on best practices for database design, storage optimization, and real-time analytics.
- Help maintain code quality, organization, and automation.
- Collaborate with other teams as needed to ensure interoperability, stability, and code reusability.
- Optimize data processing and querying for better performance and cost-efficiency.
- Implement monitoring, logging, and alerting mechanisms to ensure the health and reliability of the data platform.
- Collaborate with stakeholders to align the data platform with business goals and advise on best practices.
- Evaluate emerging technologies and recommend tools to enhance the organization’s data infrastructure.
What we offer
Work:
- Flexible working hours;
- Collaborative, friendly team environment;
- Remote/Hybrid work;
Life:
- Company social events;
- Annual corporate parties;
Health:
- Comprehensive medical insurance;
Education:
- Allowances for professional education;
- English language courses with native speakers;
- Internal knowledge-sharing sessions.
About Proxet
Proxet is a professional software development firm trusted by clients from around the world. With our expertise in AI and machine learning, we help businesses reimagine their possibilities and transform ideas into tangible digital solutions. By providing core services with an emphasis on data practices, we shape the future, one step at a time.
If you’d like to join our Proxet Nation and work closely with high-level professionals and our engineers, fill in the form!
Interested? Let's get in touch!
Tell us about yourself, then leave a link or upload your resume and we will get back to you soon!
Is your friend a good fit?
Tell us about your friend: send their resume to hr@proxet.com!
Please note: your friend must be aware of the referral. The referral bonus is paid out after the end of the trial period.