
Data Engineer


Job type: Permanent


Category: Information Technology


Experience: 5-7 Years


Industry: Information Technology

Job Responsibilities

• Design, create and maintain optimal data pipeline architecture

• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies or Google BigQuery

• Build processes that support data transformation, workload management, data structures, dependency management, and metadata

• Assemble large, complex data sets that meet functional / non-functional business requirements.

• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

• Perform data preparation for data models (data cleansing, data aggregation)

• Design and develop reliable, stable, and effective data marts to support the business.

• Create ETL jobs and data pipelines

• Monitor data quality to meet SLAs

• Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

• Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Experience requirements

• At least 5 years of experience as a Data Engineer.

• Advanced SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of SQL and NoSQL databases

• Hands-on experience with SQL database design

• Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.

• Experience with object-oriented/functional scripting languages: Python, Java, C++, etc.

• Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.

• A successful history of manipulating, processing, and extracting value from large, disconnected datasets.

• Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.

• Strong project management and organizational skills.

• Experience supporting and working with cross-functional teams in a dynamic environment.

• Strong logical thinking, a hard-working and positive attitude, and good communication skills.

Education requirements

• University degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field; graduates of technology universities, such as FPT University, are preferred.

Contact Person

• Tu Nguyen

• Adecco