Saturday, 28 March 2015

Postgres function optimisation

I am running the function below. Testing it with a much smaller table works as expected (18 rows, ~400 ms). However, when pointed at my real data (315,000 rows), it has now been running for 48 hours and is still going. That is far longer than a linear extrapolation would predict (roughly 315,000 / 18 × 400 ms, or about 2 hours).



  1. Is there a better way to do this?

  2. Is there a way to check whether it is doing what it should while it is still running?


Is there any way to optimise the below function?
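On question 2, one option worth noting: the DO block runs inside a single transaction, so other sessions cannot see its uncommitted UPDATEs. You can at least confirm from a second session that the statement is still actively executing (this sketch assumes PostgreSQL 9.2 or later, where pg_stat_activity has the state and query columns):

```sql
-- Run in a second session. Shows whether the DO block is still
-- actively executing and how long its transaction has been open.
SELECT pid,
       state,
       now() - xact_start AS transaction_age,
       query
FROM   pg_stat_activity
WHERE  state <> 'idle';
```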



DO
$do$
DECLARE
    r public.tablex%rowtype;
BEGIN
    FOR r IN SELECT id FROM public.tablex
    LOOP
        IF trunc(random() * 6 + 1)::int = 5 THEN
            UPDATE public.tablex SET test = 'variable1' WHERE id = r.id;
        ELSIF trunc(random() * 6 + 1)::int = 6 THEN
            UPDATE public.tablex SET test = 'variable2' WHERE id = r.id;
        END IF;
    END LOOP;
    RETURN;
END
$do$;
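For reference, a likely culprit for the runtime: the loop issues one UPDATE per row, and if there is no index on id, each of those 315,000 UPDATEs is a full sequential scan, which is quadratic overall. A set-based sketch of the same idea, assuming id is the primary key, might look like the following. Note one intentional difference: this version makes a single 1-to-6 draw per row and acts on 5 or 6, whereas the original makes two independent draws per row, so the probabilities differ slightly.

```sql
-- One pass over the table; d is a single 1..6 draw per row.
-- Only rows drawing 5 or 6 are updated; all others are left untouched.
UPDATE public.tablex AS t
SET    test = CASE s.d WHEN 5 THEN 'variable1' ELSE 'variable2' END
FROM  (SELECT id, trunc(random() * 6 + 1)::int AS d
       FROM   public.tablex) AS s
WHERE  t.id = s.id
  AND  s.d IN (5, 6);
```

Because random() is volatile, the subquery evaluates it once per row, so each row gets its own independent draw.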



