About the job
You’ll be working on the Platform team, building and maintaining our customer-facing application and the tools that make the world a better place for web crawler developers.
We have established products with product-market fit, where you’ll help grow the business and keep pace with market demands. At the same time, you’ll iterate quickly on testing new opportunities to determine which are worth continued investment.
We’re a data-driven team that defines success by business results rather than completion of a task. Finally, as a completely remote company with team members in many different time zones, you’ll excel in this role as an independent thinker who can always find a way to move projects forward, even if you’re the only team member online at the time.
Job responsibilities:
- Take ownership of projects and independently drive them from prototype to completion
- Build composable, reusable components for our complex single-page application (SPA)
- Design and improve the backbone of a large-scale web crawling platform
- Strive to build easy-to-maintain systems and improve existing ones
- Be proactive in bringing forth new ideas and solutions to problems
- Be a strong team player and share knowledge freely and easily with your co-workers
- Write code carefully for critical and production environments
Job requirements:
- Good knowledge of Python, MySQL, and HBase
- Backend web development experience using Django and Flask
- Experience with any distributed messaging system (RabbitMQ, Kafka, etc.)
- Strong knowledge of Linux & system programming
- Docker container basics
- The ability to evaluate different ways of solving a problem and wisely choose between a quick hotfix, a long-term solution, or a design change
- Being comfortable with Git and team-based Git workflows
- Excellent communication skills, both written and verbal, in English
- Experience developing RESTful web APIs
- Experience with real-time communication in webapps
- Experience using Celery
- Asynchronous programming experience using Python (asyncio, twisted, etc.)
- Familiarity with techniques and tools for crawling, extracting, and processing data, asynchronous communication, and distributed systems
- Familiarity with Apache Mesos, Kubernetes, RabbitMQ, Kafka, and ZooKeeper
Bonus points for:
- Experience working remotely or with a distributed team
- Experience with ASGI and Django Channels
Life at Scrapinghub
Scrapinghub is a fast-growing and diverse technology business turning web content into useful data with a cloud-based web crawling platform, off-the-shelf datasets, and turn-key web scraping services.
We’re a globally distributed team of 170 Shubbers working from over 30 countries who are passionate about scraping, web crawling, and data science.
As a new Shubber, you will:
- Become part of a self-motivated, progressive, multicultural team
- Have the autonomy to make the role your own, supported by great people
- Join a team with a huge opportunity to make a difference
- Have the freedom and flexibility to work from wherever you want
- Have the opportunity to attend conferences and meet the team from across the globe
- Get the chance to work with cutting-edge open source technologies and tools
Perks:
- Flexible working hours
- Remote working
- Paid time off
- Paid open source work
- Global team meet ups
- Learning and development opportunities