Create your first worker
We will create a simple web scraping worker that listens for URLs to scrape, fetches the content, and logs the results.
Web scraping is a perfect use case for task queue workers: failed fetches are retried automatically, and multiple URLs can be fetched in parallel.
Setting it up is straightforward and takes just a few minutes.
Create your worker
First, create a new Edge Function using the Supabase CLI.
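For example, using the same `<function-name>` placeholder as the rest of this guide:

```bash
npx supabase functions new <function-name>
```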
Then, replace its content with this code:

```typescript
// The import assumes the JSR package name; adjust if you use an import map.
import { EdgeWorker } from "jsr:@pgflow/edge-worker";

EdgeWorker.start(async (payload: { url: string }) => {
  const response = await fetch(payload.url);

  console.log("Scraped website!", {
    url: payload.url,
    status: response.status,
  });
});
```
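The built-in retries mentioned in the intro are configured when starting the worker. The sketch below is illustrative only: the option names (`queueName`, `maxConcurrent`, `retry`) are assumptions and may differ in your EdgeWorker version, so verify them against the configuration reference:

```typescript
import { EdgeWorker } from "jsr:@pgflow/edge-worker";

EdgeWorker.start(
  async (payload: { url: string }) => {
    const response = await fetch(payload.url);
    console.log("Scraped website!", { url: payload.url, status: response.status });
  },
  {
    // NOTE: option names here are assumptions -- check the EdgeWorker
    // configuration reference for your installed version.
    queueName: "tasks",   // queue to poll (matches the default used above)
    maxConcurrent: 10,    // process up to 10 messages in parallel
    retry: { limit: 5 },  // retry a failed message up to 5 times
  },
);
```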
Start Edge Runtime
Start the Edge Runtime with the following command:
```bash
npx supabase functions serve
```

This makes Supabase listen for incoming HTTP requests, but does not start your worker yet.
Start your worker
Start the worker by sending an HTTP request to your new Edge Function (replace `<function-name>` with your function name):

```bash
curl http://localhost:54321/functions/v1/<function-name>
```

This will boot a new instance and start your worker:
```
[Info] worker_id=<uuid> [WorkerLifecycle] Ensuring queue 'tasks' exists...
```
Process your first message
Your worker is now polling for messages on the `tasks` queue (which was automatically created during startup). Send a test message:
```sql
SELECT pgmq.send(
  queue_name => 'tasks',
  msg => '{"url": "https://example.com"}'::jsonb
);
```

The message will be processed immediately and you should see the following output:
```
[Info] worker_id=<uuid> [ExecutionController] Scheduling execution of task 1
[Info] Scraped website! { url: "https://example.com", status: 200 }
```
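To see the parallel fetching mentioned in the intro, enqueue several URLs at once. This sketch assumes your installed pgmq version provides `send_batch`; the worker will pick up the messages and fetch the URLs concurrently:

```sql
-- Enqueue two URLs in a single call; each becomes a separate task.
SELECT pgmq.send_batch(
  queue_name => 'tasks',
  msgs => ARRAY[
    '{"url": "https://example.com"}',
    '{"url": "https://example.org"}'
  ]::jsonb[]
);
```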