Senior Erlang Developer

Work from Home - Remote
This project has been archived and is not accepting more applications.

Description

About Scrapinghub:

Founded in 2010, Scrapinghub is a fast-growing and diverse technology business turning web content into useful data with a cloud-based web crawling platform, off-the-shelf datasets, and turn-key web scraping services.
We’re a globally distributed team of over 100 Scrapinghubbers who are passionate about scraping, web crawling and data science.

As a new Scrapinghubber, you will:
Become part of a self-motivated, progressive, multi-cultural team.
Have the opportunity to work remotely.
Have the opportunity to go to conferences and meet with the team from across the globe.
Get the chance to work with cutting-edge open source technologies and tools.


About the Job:

You will join the talented engineers on the Crawlera team in making the world a better place for web crawler developers.
Crawlera is a smart downloader designed specifically for web crawling and scraping. It allows crawler developers to crawl quickly and reliably by managing thousands of proxies internally. It is part of the Scrapinghub platform, the world’s most comprehensive web crawling stack, which powers crawls of over 4 billion pages a month.

Job Responsibilities:

Develop, maintain and support a high load distributed system.
Analyze our current and historical Crawlera usage to augment and enhance its routing and rotation logic.
Leverage the Scrapinghub platform to provide extended functionality, both to end users and for internal purposes.
Identify and resolve performance and scalability issues with distributed crawling at scale.
Liaise with other platform teams to ensure Crawlera integrates as smoothly as possible with the growing Scrapinghub platform.

Required Skills:

2+ years of production experience with Erlang.
Good communication in written English.
Strong knowledge of Linux/UNIX, HTTP and Networking.

Desired Skills:

Knowledge of Python.
Familiarity with techniques and tools for crawling, extracting and processing data.
Knowledge of ELK, Graylog, Docker and Mesos.
Good spoken English.
Previous remote working experience.

Please apply through this link - https://jobs.lever.co/scrapinghub/2a7dddd9-76f4-41f4-b55b-cb8e2641d9ab/apply
Start date: ASAP
Duration: 12 months (extension possible)
From: Scrapinghub
Published at: 05.12.2017
Contact person: Jessica Quinn
Project ID: 1465119
Contract type: Freelance
Workplace: 100% remote