Responsibilities:

Develop and maintain full-stack applications that support data acquisition, including internal tools and dashboards.
Collaborate closely with cross-functional teams, including Data Processing, Architecture, and Scaling, to ensure seamless data ingestion and workflow management.
Design and implement APIs to facilitate data interactions between internal services and external data sources.
Enhance user experience by developing intuitive web-based interfaces for managing and monitoring data pipelines.
Optimize backend services for performance, scalability, and security in a distributed computing environment.
Work with legal and compliance teams to ensure our data acquisition processes adhere to privacy regulations and best practices.
Deploy and maintain infrastructure using Kubernetes and Infrastructure-as-Code (IaC) methodologies.
Analyze system performance, conduct experiments, and improve data workflows to maximize efficiency.

Qualifications:
BS/MS/PhD in Computer Science or a related field.
4+ years of industry experience in full-stack development.
Proficiency in frontend frameworks (React, Vue, or similar) and backend technologies such as Python, Node.js, or Go.
Strong expertise in RESTful APIs, GraphQL, and database design (SQL and NoSQL).
Experience building data-intensive applications that handle large-scale datasets.
Familiarity with cloud platforms (AWS, GCP, or Azure) and container orchestration (Kubernetes, Docker).
Prior experience with web crawling and large-scale data processing is a plus.
Strong problem-solving skills and the ability to balance multiple priorities in a fast-moving environment.
Excellent communication and collaboration skills.