Job Archives

Posted 2 years ago

Details of activities:

- Design, implement, and maintain their big data platform.
- Work with the development team to ensure that data pipelines and ingestion processes are highly available, scalable, and secure.
- Collaborate with other development team members to identify and implement solutions to complex big data problems.
- Monitor and troubleshoot big data issues, performance bottlenecks, and system failures.
- Develop and maintain big data processing scripts using tools such as Apache Spark and Hadoop.
- Develop and maintain data visualization and reporting tools.
- Implement and maintain data quality and data governance processes.
- Participate in the on-call rotation and provide support for production Big Data systems as needed.
- Research and evaluate new tools and technologies to improve my clients' big data platform.


Skills and requirements:

- At least 5 years of experience in Big Data or related fields, with more than 10 years of total IT experience
- Strong experience in Java/Python design and programming
- Experience using AWS or GCP cloud services
- Strong experience with big data processing frameworks such as Apache Spark, Hadoop, or similar
- Experience with data visualization and reporting tools such as Tableau, Power BI, or similar
- Experience with SQL and databases such as Oracle, MySQL, or PostgreSQL
- Experience with data quality and data governance processes

Salary

32,000 to 38,000 per year, depending on experience and open to negotiation

Posted 2 years ago

Salary : 28000 to 40000

We are currently looking for a highly experienced GCP (Google Cloud) Data Engineer - starting in September 2023

The Role:

- Contribute to the design and improvement of a new data platform on GCP
- Work with data warehouse technologies such as Apache Airflow, Kubernetes, and BigQuery
- Build new generation tools to support the data engineering teams
- Be involved in both Technical & Product decisions

Skills and requirements:

- GCP certified, preferably in Data Engineering
- Minimum 5 years' commercial GCP experience
- Experience with MLOps and working closely with Data Scientists
- Python/Java programming skills
- Experience with BigQuery
- Infrastructure experience with Terraform or Kubernetes
- Machine Learning and AI in cloud applications
- Data Engineering/Data Warehousing experience
- Dataflow, Google PubSub, Composer
- Data Migration into Google Cloud
- Dialogflow, AutoML, AI Platform
