Data Engineer (DATA LAKE) opening at a direct non-life insurance company
Job ID: 1230264
Recruitment closed
Job Title
Data Engineer (DATA LAKE)
Recommended Age
20s
30s
40s
50s and above
Expected Annual Salary
Approx. ¥8,000,000–¥9,000,000 (determined per company regulations, taking experience and ability into account)
Job Description
We have embarked on an exciting journey of Digital Transformation to meet our ambitious business goals.
Data engineers working in our Data Lake Team carry out a wide variety of data platform and business intelligence tasks in an AWS based cloud computing environment.
●Key Accountabilities & Specific Activities
-Designing & Modeling
(with minimal supervision)
・Build the first iterations of high-quality, sustainable data pipelines and ETL processes that extract data from a variety of APIs and relational databases and ingest it into AWS services.
・Efficiently draft complex SQL queries and data models to aggregate and transform data for the reporting and analytics teams.
-Execution and Maintenance
(with minimal supervision)
・Monitor existing solutions and work proactively to rapidly resolve errors and identify future problems before they occur.
・Own and keep the system design and operations documents up to date.
-Others
・Consult with a variety of stakeholders to gather new project requirements and translate them into well-defined tasks and targets.
●Relations with other departments
Mainly IT teams (Enterprise Data, Data Strategy, Architecture), Finance, Marketing, etc.
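As a rough illustration of the Python-and-SQL pipeline work described above (extract records from an API or database, load them, and aggregate them for reporting), the sketch below uses an invented insurance payload; the table and column names are hypothetical, and an in-memory sqlite3 database stands in for the AWS services named in the posting.

```python
import json
import sqlite3

# Hypothetical payload standing in for an API response; in the role described
# above, this would come from a REST API or a relational source instead.
RAW = json.dumps([
    {"policy_id": 1, "region": "Kanto", "premium": 120.0},
    {"policy_id": 2, "region": "Kanto", "premium": 80.0},
    {"policy_id": 3, "region": "Kansai", "premium": 200.0},
])

def run_pipeline(raw_json: str) -> list[tuple]:
    """Extract JSON records, load them into SQL, and aggregate per region."""
    records = json.loads(raw_json)              # extract
    conn = sqlite3.connect(":memory:")          # stand-in for the data warehouse
    conn.execute(
        "CREATE TABLE policies (policy_id INT, region TEXT, premium REAL)"
    )
    conn.executemany(                           # load
        "INSERT INTO policies VALUES (:policy_id, :region, :premium)", records
    )
    return conn.execute(                        # transform / aggregate
        "SELECT region, SUM(premium) FROM policies "
        "GROUP BY region ORDER BY region"
    ).fetchall()

print(run_pipeline(RAW))  # [('Kansai', 200.0), ('Kanto', 200.0)]
```

In practice the aggregation step would run on a managed engine such as Athena or Glue rather than sqlite3, but the extract-load-aggregate shape is the same.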
Required Skills
●Requirements (1): Technical skills, Job experiences
【Minimum qualifications (Must Have)】
-3 years of practical experience in data / analytics with at least 1 year working in an engineering / BI role.
-At least 1 year of practical experience working on data pipelines or analytics projects with Python and SQL.
【Preferred qualifications】
-5 years of practical experience in data / analytics with at least 2 years working in an engineering / BI role.
-Experience with NoSQL databases (DynamoDB, MongoDB, Elasticsearch, Redis, Neo4j, etc.)
-Strong knowledge and practical experience working with at least 3 of the following AWS services: S3, EMR, ECS/EC2, Lambda, Glue, Athena, Kinesis/Spark Streaming, Step Functions, CloudWatch, DynamoDB.
-Strong experience working with data processing and ETL systems such as Oozie, Airflow, Azkaban, Luigi, SSIS.
-Experience developing solutions inside a Hadoop stack using tools such as Hive, Spark, Storm, Kafka, Ambari, Hue, etc.
-Ability to work with large volumes of both raw and processed data in a variety of formats (JSON, CSV, Parquet, ORC, etc.).
-Ability to work in a Linux/Unix environment (predominantly via EMR & AWS CLI / HDFS).
-Experience with DevOps tools (Jenkins, GitHub, Ansible, Docker, Kubernetes).
●Requirements (2): Behavioral skills
-Able to effectively communicate with internal and external technical and user teams.
-Ability to make complex ideas accessible.
-Fast adaptability; a self-starter who works autonomously.
-Focused on solving problems.
-Result-oriented.
●Requirements (3): Certificate, Education background
Master’s degree level in computer science field (or equivalent experience)
●Requirements (4): Language skills
-English (mandatory): Basic English communication skills, especially writing. English will be the main working language.
-Japanese (nice to have, but not mandatory)
●Other comments (if any)
Candidates with strong Python and SQL skills and experience with data modelling in a cloud environment are encouraged to apply.
AXA Direct Japan works mainly with AWS, but candidates with relevant GCP, Azure, or Databricks experience will also be considered.
Python and SQL skills are absolute must-have requirements for this position.
Work Location
Employment Type
Full-time (permanent employee)
Company
A globally operating direct non-life insurance company
Company Overview
A foreign-owned direct non-life insurer.
Company PR
A foreign-owned non-life insurance company with global strength in the direct-channel non-life insurance business.