Job Information
Within3 Senior Python Backend Software Engineer (FULLY REMOTE - USA) in Ohio
Who We Are!
Within3 is the world leader in insights management for life science organizations. Our platform identifies the right people, actively engages them, and delivers answers that drive agile, informed decision-making.
We work hard to create a dynamic, collaborative culture where innovation is encouraged. Even as a globally distributed organization, we strive to create and maintain connections that not only make work fun but make our work better!
The Challenge
As a Senior Python Software Engineer at Within3, you will lead the development of advanced insights capabilities and support public data integration projects. Your role will involve data engineering, processing pipelines, deployment, performance evaluations, and collaboration with data scientists on algorithms and models.
The ideal candidate will be curious and compassionate, with a strong foundation in software engineering and a passion for solving complex problems and driving innovation. This role requires a detail-oriented, inquisitive individual with excellent problem-solving skills who is eager to integrate data insights from multiple sources and to contribute to the technical roadmap. By joining us, you will be part of a dynamic distributed team dedicated to solving challenging data problems and advancing our capabilities to better serve our clients.
What you can expect from the opportunity
Design, guide and advance both existing and new data processing pipelines that serve the company’s flagship SaaS platform
Work collaboratively with a distributed, cross-functional team to deliver modern, scalable, SaaS software
Focus on solving complex problems rather than specific tools
Be curious, empathetic and humble
Advance best practices related to software and data engineering
Write clear, extensible, maintainable and testable code
Ensure solutions are robust, scalable and efficient
Work with product owners to refine internal and external requirements so that we build the right solution to the problem
Work with QA to ensure platform stability and quality
Engage in the continued pursuit of improved skills and knowledge
Engage in mentoring, sharing knowledge and feedback
Help foster a culture of appreciation and understanding of data, systems and scale
Become a subject matter expert in our data domain
Requirements
What you need to grow and succeed
Required
6+ years of professional programming experience with Python
Well versed in software engineering practices including version control (git), dependency management, testing (pytest) and CI/CD
Strong foundation in data engineering and data processing pipelines
Understanding of privacy-by-design frameworks and code-level security techniques
Experience with SOA, job queues, concurrent programming and observability
Proficient with SQL (any variant)
Direct experience using at least one of the following RDBMS / OLAP / OLTP databases: PostgreSQL, MySQL, MariaDB, SQL Server, Oracle, Redshift or Snowflake
Direct experience using at least one of the following NoSQL databases: Redis, MemoryDB, DynamoDB, Neptune, Neo4j
Direct experience with at least one data pipeline framework such as Airflow, Spark, Flink, Snowflake, Kafka, Amazon SWF, Amazon EMR or similar technologies
Understanding of modern API design patterns (REST, JSON, GraphQL, etc.)
Strong collaboration and communication (verbal and written) skills
Flourishes in a team which delights in collaborating, building each other up and continuous learning
Ability to foster cross-team relationships with empathy and respect
Enthusiastic and passionate about craftsmanship, scalable systems and data
Comfortable learning new technologies
Exceptional attention to detail and strong analytical, problem-solving, and critical-thinking skills
Ability to decompose complex problems and work towards clean solutions by yourself and with a team
A pragmatic approach to solving problems and evaluating new technologies (e.g. balancing quality, craftsmanship, speed and cost)
Only applicants who currently reside in, and have the relevant authorization to work in, the USA will be considered.
Preferred
Direct experience with at least one observability tool such as DataDog, Honeycomb, New Relic
Experience with one or more of the following data libraries: Pandas, NumPy, SciPy
Experience with one or more machine learning frameworks/tools such as TensorFlow, PyTorch, Keras, Scikit-learn, SageMaker
Familiarity with data versioning tools (Delta Lake, DVC, LakeFS, etc.)
Experience with Docker
Don’t meet every single requirement? Studies have shown that women, members of the LGBTQ+ community and people of color are less likely to apply to jobs unless they meet every single qualification. At Within3 we are dedicated to building a diverse, inclusive, and authentic workplace, so if you’re excited about this role but your experience doesn’t align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
Benefits
What's on offer
Fully remote home-based flexible working
Set up your own home working space, including a MacBook Pro and a monthly communication allowance.
You will benefit from our unique approach to flexible working - work-life balance! And we mean it!
A competitive compensation package and benefits program aligned to your location.
Growth & advancement in a forward-thinking and dynamic organization where culture is everything!
Don't believe us? Check us out on Glassdoor (https://www.glassdoor.co.uk/Overview/Working-at-Within3-EI_IE607750.11,18.htm?countryRedirect=true)
Within3 is committed to creating a diverse and inclusive work environment and is proud to be an equal opportunity employer. We invite you to consider opportunities at Within3 regardless of your gender; gender identity; gender reassignment; age; religious or similar philosophical belief; race; national origin; political opinion; sexual orientation; disability; marital or civil partnership status or other non-merit factor.