Data Engineer - Databricks
ScoutOut Recruitment
2 months remaining


50,000 - 80,000 THB / month



Full-time job





  • Education scholarship
  • Health insurance
  • Flexible working hours
  • Life insurance
  • Work from home
  • Performance-based bonus
  • Dental benefits
  • 5-day work week


  • Work experience
    Minimum 3 years of experience
  • Education


Critical Tasks

  • Work with business consultants, data scientists, data analysts and other stakeholders to deliver big data analytics or data lakehouse platforms that meet customers' data and analytics needs.
  • Design and recommend the right data architectures, in line with customer requirements.
  • Design, build, test, deploy and maintain data pipelines using Spark, SQL and Python within Databricks, to unlock the full potential of data analytics capabilities that enable customers and internal stakeholders to manage the business and make effective decisions.
  • Build secure, available, scalable, stable and cost-effective data platforms using services from Databricks, AWS, Azure, GCP or other on-premise/cloud data platforms, ensuring data governance, management and security.
  • Ensure work assignments of self and supervised team members are achieved on time with quality, accuracy and relevance.
  • Deliver business and technical engagements related to data with internal and external stakeholders, e.g. sales, solution sales, technology/service partners and customers. This includes participating in business development, understanding customer requirements, presenting solutions, developing required materials, responding to RFPs/RFIs, and creating data products and applications.
  • Keep abreast of the latest Databricks, Azure, AWS and related technology advancements, as well as good practices and approaches in the data lakehouse, data warehouse and big data space.
  • Work on multiple initiatives simultaneously.
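The pipeline responsibilities above follow the classic extract → transform → load shape. As a rough illustration only (not part of the posting): in Databricks this would typically use PySpark DataFrames, but the same shape can be sketched in plain Python. All names here (extract, transform, load, the "orders" table) are hypothetical.

```python
def extract(rows):
    """Extract: pull raw records from a source (here, an in-memory list;
    in Databricks this would be e.g. spark.read from cloud storage)."""
    return list(rows)

def transform(records):
    """Transform: apply data-quality rules and reshape records."""
    cleaned = []
    for r in records:
        if r.get("amount") is None:
            continue  # data-quality rule: drop rows with no amount
        cleaned.append({"order_id": r["order_id"], "amount_thb": float(r["amount"])})
    return cleaned

def load(records, sink):
    """Load: write curated records to a target table (here, a dict of lists)."""
    sink.setdefault("orders", []).extend(records)
    return len(records)

raw = [{"order_id": 1, "amount": "250.0"}, {"order_id": 2, "amount": None}]
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)
```

A real pipeline would add scheduling, incremental loads and monitoring on top of this skeleton; the point is only the separation of stages that the role description assumes.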

Professional Experience

  • Bachelor's Degree in Computer Science/IT or equivalent.
  • 3+ years of data engineering experience with knowledge of relational databases, data warehouse and big data.
  • Strong SQL knowledge and database experience working on relational, dimensional and non-relational databases.
  • Hands-on experience designing, building and deploying end-to-end ETL processes for complex data warehouse projects: data integration, data mapping, data transformation, data structures, metadata and data processing.
  • Experience with big data tools: Hadoop, Spark, Kafka, Python, Scala, Java or C++.
  • Experience in building data warehouse, data lakehouse or big data using SQL or ETL tools such as Azure Data Factory, AWS Glue ETL, GCP Dataflow, Talend, Informatica, Apache NiFi or Apache Airflow.
  • Experience in designing architectures and building data lakehouses with Databricks or with one or more cloud ecosystems (Azure, AWS, GCP).
  • Strong expertise in the Databricks stack and one or more cloud ecosystems (Azure, AWS, GCP).
  • Experience integrating Databricks with Azure or AWS to support data analytics capabilities.
  • Experience with stream-based data extraction processes or APIs.
  • Awareness of data privacy and PDPA.
  • Experience with BI and visualisation tools (e.g. Tableau, Power BI) is a plus.
  • Familiar with Waterfall and Agile methodologies.
  • Excellent analytical, problem solving, creative thinking, and planning skills 
  • A self-motivated, driven, flexible, quick-learning, “can do” attitude.
  • Extraordinary attention to detail
  • Able to work collaboratively in a team environment 
  • Able to communicate complex ideas effectively, both verbally and in writing, in English and Thai
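The "stream-based data extraction" requirement above refers to consuming records incrementally rather than in one batch. As a hedged sketch only: a real deployment would read from Kafka or an HTTP API; here a plain Python generator stands in for the stream, and the micro-batching logic is an illustrative assumption, not a method from the posting.

```python
def event_stream(events):
    """Stand-in for a streaming source (e.g. a Kafka topic or paginated API)."""
    for e in events:
        yield e

def consume(stream, window=2):
    """Micro-batch the stream: collect `window` events, then emit the batch."""
    batch, batches = [], []
    for event in stream:
        batch.append(event)
        if len(batch) == window:
            batches.append(batch)
            batch = []
    if batch:
        batches.append(batch)  # flush the final partial batch
    return batches

batches = consume(event_stream([1, 2, 3, 4, 5]), window=2)
# batches -> [[1, 2], [3, 4], [5]]
```

Micro-batching like this is also the model Spark Structured Streaming uses, which is why the posting lists Spark and Kafka together.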



The Ninth Towers Grand Rama9, Rama 9 Road, Huai Khwang, Bangkok, Thailand


Nearby stations: Phra Ram 9, Thailand Cultural Centre, Phetchaburi


ScoutOut Recruitment
21 openings · 2 branches | 51-200 employees

Comprehensive recruitment services, tailored to your business needs.