About this position
You will be part of a small cross-functional team working toward the company's goals: implementing pipelines that move information across systems and building tools to collect and visualize data, in both real-time and batch processing. This position requires strong technical and communication skills.
Some of your responsibilities will include
Create analytical reports and provide broad, in-depth analysis for use by all client areas;
Apply statistical models to identify customer behaviour and provide solutions based on them;
Ensure data-driven decision making;
Ensure data quality;
Build reliable data pipelines to move data;
Develop messaging protocols across systems;
Ensure data availability to the entire company from many sources;
Develop and maintain systems using Java, Scala and Python;
Build solid solutions using cutting-edge technologies;
Collaborate with the enterprise architecture and solutions architecture teams;
Implement challenging parts of 99's ecosystem.
You will deal with
150+ servers in the AWS cloud;
500k+ requests per minute;
Many distributed and complex systems;
Real impact on thousands of lives.
What we're looking for
3+ years of software development experience with solid knowledge of large scale systems using Java/Scala/Rails/Python;
3+ years as a data scientist, BI analyst or quantitative analyst;
Must have an excellent technical understanding of client-server fundamentals and enterprise architecture patterns, beyond just REST;
Must have a technical understanding of different databases and their uses (MySQL, Postgres, Elasticsearch, Cassandra);
Strong knowledge of cloud-native solutions on AWS (e.g. EC2, Beanstalk, SQS, SNS, Load Balancer, RDS, DynamoDB);
Experience with messaging systems such as Kafka and RabbitMQ;
Experience with big data technologies such as Storm, Samza, Hadoop, Presto and Airflow;
Must have strong SQL skills;
Experience with columnar databases like Amazon Redshift;
Solid Java, Scala, Rails, Python or R programming skills;
Solid test-oriented approaches to solving business problems;
Experience with statistical methods (clustering, regression analysis, principal component analysis, GARCH, ARIMA, etc.);
Experience applying research skills to solve business problems.
Additional Eligibility Qualifications
Bachelor's Degree in Information Systems (IS), Information Technology (IT), Computer Science, Engineering, Mathematics or Physics from an accredited college or university;
Experience with protocols other than HTTP, such as Thrift or MQTT;
Interested in math and algorithms.
Frameworks: Play, Akka, Slick;
Databases: MySQL, Postgres, Redis, Elasticsearch, DynamoDB;
Cloud: AWS, Docker, Elastic Beanstalk, S3, SQS, SNS, Lambda;
Tools: Splunk, NewRelic, Crashlytics, Tableau.
We prefer CVs in PDF format or links to your portfolio and LinkedIn.