
[return to "PostgresML is 8-40x faster than Python HTTP microservices"]
1. chaps+b5 2022-10-20 01:45:53
>>redbel+(OP)

  "In Python, most of the bottleneck comes from having to fetch and deserialize Redis data."
This isn't a fair comparison. Of freaking course postgres would be faster if it's not reaching out to another service.
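To be concrete, the Python path the quote describes looks roughly like this (a sketch on my part; the Redis key layout, pickle encoding, and model object are assumptions, not from the article):

    import pickle
    import redis

    r = redis.Redis(host="localhost", port=6379)

    def predict(user_id, model):
        # One network round trip to a separate service per request...
        raw = r.get(f"features:{user_id}")   # hypothetical key layout
        features = pickle.loads(raw)         # ...plus deserialization on every call
        return model.predict([features])

Every request pays that round trip plus the deserialize, which is exactly the step a single-process setup skips.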
2. redhal+c7 2022-10-20 02:09:12
>>chaps+b5
Yes, that's essentially the point being made here. It's a fair comparison if your intent is to run this kind of job as quickly as possible.
3. pushed+4a 2022-10-20 02:40:00
>>redhal+c7
I also don’t think it’s a fair comparison. There’s nothing stopping me from loading the model into the memory of each Flask process (or some shmem) and getting the same or possibly better performance than the Postgres implementation, especially considering coroutines are being used in the Python case.

Calling this Postgres vs Flask is misleading at best. It’s more like “1 tier architecture vs 2 tier architecture”
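
Roughly what I have in mind, as a sketch (the model path and request shape are made up; assumes a scikit-learn-style model saved with joblib):

    import joblib
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Loaded once per worker process at import time; requests never leave
    # the process to fetch the model or its inputs.
    model = joblib.load("model.joblib")  # hypothetical path

    @app.route("/predict", methods=["POST"])
    def predict():
        features = request.get_json()["features"]  # assumed request shape
        return jsonify(prediction=model.predict([features]).tolist())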

4. montan+rg 2022-10-20 03:47:32
>>pushed+4a
You get it. 1 tier is better than 2 tier. Python can't be 1 tier unless it loads the full dataset into the process, which is generally not feasible for production online inference. PostgresML is 1 tier, and it supports the traditional Python use cases.
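
Concretely, the app tier only ships a SQL call and inference happens next to the data (a sketch assuming the pgml.predict call from the PostgresML docs; the DSN, project name, and feature values are placeholders):

    import psycopg2

    conn = psycopg2.connect("dbname=pgml_development")  # placeholder DSN

    with conn.cursor() as cur:
        # Inference runs inside Postgres, next to the data: no second
        # service to call, nothing to fetch and deserialize app-side.
        cur.execute(
            "SELECT pgml.predict('my_project', ARRAY[%s, %s, %s]::real[])",
            (1.0, 2.0, 3.0),
        )
        print(cur.fetchone()[0])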
5. xapata+bj 2022-10-20 04:25:21
>>montan+rg
Why can't Python be 1 tier? It's a general-purpose, extensible language. It can do anything that PostgreSQL can do.