Token Metrics is Hiring

Job Title: Big Data Engineer

Responsibilities

  • Collaborate with coworkers and clients to understand project requirements.
  • Design and implement infrastructure for accessing and analyzing big data.
  • Optimize existing frameworks for enhanced functionality.
  • Test and validate data structures to ensure usability.
  • Build data pipelines from various sources (API, CSV, JSON, etc.); see the illustrative sketch after this list.
  • Prepare raw data for manipulation by Data Scientists.
  • Implement data validation and reconciliation methodologies.
  • Ensure secure data backup and accessibility for relevant team members.
  • Stay updated on industry standards and technological advancements.
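For illustration, below is a minimal sketch of the kind of multi-source pipeline and validation work this role involves. The endpoint URL, file paths, and column names are hypothetical placeholders, not Token Metrics' actual data sources or schema.

```python
# A minimal sketch of a multi-source ingestion pipeline with basic validation.
# All URLs, paths, and column names here are hypothetical examples.
import json

import pandas as pd
import requests


def load_api(url: str) -> pd.DataFrame:
    """Pull JSON records from a REST endpoint and normalize into a table."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return pd.json_normalize(response.json())


def load_csv(path: str) -> pd.DataFrame:
    """Read a CSV export into a DataFrame."""
    return pd.read_csv(path)


def load_json(path: str) -> pd.DataFrame:
    """Read a local JSON file of records into a DataFrame."""
    with open(path) as f:
        return pd.json_normalize(json.load(f))


def validate(df: pd.DataFrame, required_columns: list[str]) -> pd.DataFrame:
    """Basic validation: required columns present, fully empty rows dropped."""
    missing = set(required_columns) - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {missing}")
    return df.dropna(how="all")


if __name__ == "__main__":
    # Hypothetical sources; real pipelines would point at production systems.
    frames = [
        load_api("https://api.example.com/prices"),
        load_csv("exports/prices.csv"),
        load_json("exports/prices.json"),
    ]
    combined = pd.concat(frames, ignore_index=True)
    clean = validate(combined, required_columns=["asset", "price", "timestamp"])
    print(clean.head())
```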

Requirements

  • Bachelor’s degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field.
  • A Master’s degree in a relevant field is an added advantage.
  • 3+ years of experience in Python, Java, or other programming languages.
  • Proficient in SQL & NoSQL (Snowflake Cloud DW & MongoDB experience is a plus).
  • Strong experience in schema design and dimensional data modeling.
  • Expertise in SQL, NoSQL, Python, C++, Java, and R.
  • Experience building data lakes, data warehouses, or equivalent systems.
  • Proficient in AWS Cloud.
  • Excellent analytical and problem-solving skills.
  • Ability to work independently and collaboratively.
  • Effective pipeline management with minimal supervision.

About Token Metrics

Token Metrics assists crypto investors in building profitable portfolios through AI-based crypto indices, rankings, and price predictions. The company serves a diverse customer base of retail investors, traders, and crypto fund managers in over 50 countries.
