Who we are:
Gideon Brothers develops autonomous, collaborative, and modular robot platforms and autonomy technologies for unstructured, indoor, human environments. Our robots are designed to assist operations across a range of industries. Our current team includes 9 PhDs and 40 holders of master's degrees in hardware and software engineering and related disciplines. We are one of the largest robotics and autonomous-technology groups in Southeast Europe, with offices in Zagreb and Osijek. We are looking for a DevOps/Infrastructure Engineer for both offices.
Scope of work:
In this position, you will focus on systems infrastructure: helping manage engineering requirements and anticipating potential issues by identifying possible collisions in dependencies among teams and their technology components. We are also looking for someone to design tools for deployment and validation-test workflows.
- Integrate and onboard the full autonomy-stack solution.
- Plan with engineering teams to anticipate potential issues and identify dependencies.
- Build critical tools to manage our deployment and validation test workflows.
- Design and implement best practices for security, monitoring and logging systems.
- Manage, store and present test results and data acquired from our robot fleet.
- Prepare system documentation.
Requirements:
- Several years in software or DevOps engineering roles.
- Experience writing production software using C/C++, Python or similar languages.
- Strong UNIX/Linux background and networking fundamentals.
- Knowledge of shell scripting languages (Bash).
- Experience working with Docker development and deployment workflows.
- Experience with cloud infrastructure providers and supported technologies.
- Ability to identify and act on opportunities to improve processes and increase efficiency.
- Excellent communication skills and fluency in English.
- Work experience in robotics software and environments, including ROS, OpenCV, PCL, etc.
- Experience managing physical compute, storage, and networking hardware.
- Experience with Hadoop, Spark, or other data processing tools.
- BSc or MSc in Computer Science/Engineering or related field.