Big Data Team Lead


Requirements (keywords): teamlead ai bigdata lead etl aws python qa design designing collaboration education rdbms aurora postgres oracle sql presto hadoop hive spark hbase kafka talend informatica airflow lambda english pyspark pandas
English: Upper-intermediate

Purpose of the job:

Our client is a mission-driven organization whose employees are passionate about keeping people safe. Simply stated, they put people first, whether protecting those visiting the places where people gather or supporting and developing their own employees.
Our product is the world’s first and only touchless security screening solution that meets all of the post-pandemic security screening requirements. It is up to ten times faster and far more effective than last-generation products because it uses new sensor technology and artificial intelligence. Unlike traditional metal detectors, it is powered by an AI software platform that enables the system to spot weapons while ignoring harmless personal items. It keeps high-volume entrances flowing by reliably detecting guns and other weapons as visitors walk through at a natural pace while carrying their phones and bags. As a Big Data Team Lead, you will manage a cross-functional team and build an ETL solution based on AWS services and Python, covering data transformation, applying models, and loading for further analysis.
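The ETL flow described above (extract raw data, transform it by applying a model, load it for analysis) can be sketched in plain Python. This is a minimal illustration only: the function names, the in-memory source/sink, and the toy "threat score" model are all assumptions, not the client's actual pipeline or AWS setup.

```python
# Minimal ETL sketch: extract -> transform (apply a model) -> load.
# All names and the threshold "model" here are illustrative assumptions.

def extract(records):
    """Extract: pull raw sensor events (in-memory stand-in for S3/Kafka)."""
    return [r for r in records if r.get("sensor_id") is not None]

def apply_model(event):
    """Transform: a toy 'model' that flags events above a signal threshold."""
    return {**event, "flagged": event["signal"] > 0.8}

def load(events, sink):
    """Load: append transformed rows to a sink (stand-in for a warehouse table)."""
    sink.extend(events)
    return len(events)

raw = [
    {"sensor_id": 1, "signal": 0.95},
    {"sensor_id": 2, "signal": 0.40},
    {"sensor_id": None, "signal": 0.99},  # malformed row, dropped at extract
]
warehouse = []
loaded = load([apply_model(e) for e in extract(raw)], warehouse)
```

In the real role each stage would map onto the AWS stack named below (e.g. Glue/Athena for transforms, Aurora or a warehouse for the load), with an orchestrator scheduling the steps.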


Responsibilities:

  • Lead a cross-functional team of backend developers and QA engineers
  • Design, develop, deliver and operate scalable, high-performance data processing software
  • Work proactively on the system architecture
  • Create, fix, and improve ETLs, procedures, and functions based on Big Data solutions
  • Ensure high-quality development standards (unit/integration tests, etc.)
  • Check data flows end to end
  • Design database schemas according to data processing needs
  • Collaborate with the product management team to incorporate the needs of our customers



Requirements:

  • At least 2 years of experience as a team lead
  • At least 5 years of experience in data warehouse and multi-dimensional database design and development using formal methodologies
  • At least 5 years of experience with at least one RDBMS (AWS Aurora, Postgres, Oracle, SQL Server, etc.)
  • Expert in SQL writing and tuning techniques
  • At least 2 years of experience with Big data tools (Presto, AWS Athena, AWS Glue, Hadoop, Hive, Spark, HBase, Kafka, etc.)
  • Experience with at least one ETL tool (Talend, Informatica, Kettle, etc.) and with ETL/pipeline orchestration using Airflow or Prefect
  • Python knowledge
  • Experience with AWS Lambda is desirable
  • Strong troubleshooting experience
  • Upper-intermediate level of English


Would be a plus:

  • PySpark, Pandas, Databricks Koalas
  • DevOps experience for Big Data solutions