Data Engineer (DATA LAKE) position at a direct non-life insurance company

Job ID: 1230264

Recruitment closed

Job listing details

Job category

Data Engineer(DATA LAKE)

Recommended age range

20s
30s
40s
50s and above

Expected annual salary

Approx. ¥8,000,000 to ¥9,000,000 (determined per company policy, taking experience and ability into account)

Job description

We have embarked on an exciting journey of Digital Transformation to meet our aggressive business ambitions.
Data engineers working in our Data Lake Team carry out a wide variety of data platform and business intelligence tasks in an AWS-based cloud computing environment.

●Key Accountabilities & Specific Activities
-Designing & Modeling
(with minimal supervision)
・Build the first iterations of high quality and sustainable data pipelines and ETL processes to extract data from a variety of APIs and relational databases and ingest into AWS services.
・Efficiently draft complex SQL queries and data models to aggregate and transform data for reporting and analytics teams.

-Execution and Maintenance
(with minimal supervision)
・Monitor existing solutions and work proactively to rapidly resolve errors and identify future problems before they occur.
・Own and keep the system design and operations documents up to date.

-Others
・Consult with a variety of stakeholders to gather new project requirements and transform these into well-defined tasks and targets.

●Relations with other departments
Mainly IT teams (Enterprise Data, Data Strategy, Architecture), Finance, Marketing, etc.

Required skills

●Requirements (1): Technical skills, Job experiences
【Minimum qualifications (Must Have)】
-3 years of practical experience in data / analytics with at least 1 year working in an engineering / BI role.
-At least 1 year of practical experience working on data pipelines or analytics projects with Python and SQL.

【Preferred qualifications】
-5 years of practical experience in data / analytics with at least 2 years working in an engineering / BI role.
-Experience with NoSQL databases (DynamoDB, MongoDB, Elasticsearch, Redis, Neo4j, etc.)
-Strong knowledge and practical experience working with at least 3 of the following AWS services: S3, EMR, ECS/EC2, Lambda, Glue, Athena, Kinesis/Spark Streaming, Step Functions, CloudWatch, DynamoDB.
-Strong experience working with data processing and ETL systems such as Oozie, Airflow, Azkaban, Luigi, SSIS.
-Experience developing solutions inside a Hadoop stack using tools such as Hive, Spark, Storm, Kafka, Ambari, Hue, etc.
-Ability to work with large volumes of both raw and processed data in a variety of formats (JSON, CSV, Parquet, ORC, etc.).
-Ability to work in a Linux/Unix environment (predominantly via EMR & AWS CLI / HDFS).
-Experience with DevOps tools (Jenkins, GitHub, Ansible, Docker, Kubernetes).

●Requirements (2): Behavioral skills
-Able to effectively communicate with internal and external technical and user teams.
-Capability to make complex ideas accessible
-Fast adaptability; self-starter, autonomous
-Focused on solving problems
-Result-oriented

●Requirements (3): Certificate, Education background
Master’s degree (or equivalent experience) in a computer science-related field

●Requirements (4): Language skills
-English (mandatory): Basic communication skills, especially writing; English will be the main working language.
-Japanese (nice to have, but not mandatory)

●Other comments (if any)
Candidates with strong Python and SQL skills and experience with data modelling in a cloud environment are encouraged to apply; Python and SQL are absolute must-have requirements for this position.
AXA Direct Japan works mainly with AWS, but candidates with relevant GCP, Azure or Databricks experience will also be considered.

Employment type

Full-time employee

Company name

A globally operating direct non-life insurance company

Company overview

A foreign-owned direct non-life insurer.

Company PR

A foreign-owned non-life insurance company with global strength in direct-channel property and casualty insurance.
