We are building a SaaS product that lets any company run data analysis and ML workloads on data stored in their own silo while that data stays encrypted at all times, even during ML operations. To achieve this we use the latest frameworks and are building a highly innovative product. Our ideal team member wants to write fast, secure, concurrent, and optimised code.
What you will do:
- Work with various cloud platforms to run our infrastructure.
- Develop processes, CI/CD pipelines, and the general backend of our data analysis product.
- Optimise, stress-test, and continuously benchmark our code.
- Take part in code reviews and contribute, with the whole team, to decisions on where to take the infrastructure next.
What we look for:
- Deep understanding of the AWS, GCP, and IBM Cloud stacks.
- At least 2 years of experience working with Go.
- Experience with networking protocols and distributed systems.
- Experience with popular data stores such as PostgreSQL, BigQuery, S3, and Snowflake.
- Some understanding of, or prior experience with, ML.
Great to have:
- Experience with AWS SageMaker.
- DevOps experience.
How to apply
Please send a resume or short introduction, along with your GitHub/Stack Overflow profile, to firstname.lastname@example.org