
Exactly. Calling the find_in_obj() function can get fairly expensive, especially since it has to execute JSON.parse() on every call.

There is definitely a lot of room for improvement in the Postgres native JSON toolkit. I am hoping that building this exposes more of those issues and helps move it forward.



I haven't tried it, but I think you would be able to use a PostgreSQL functional index (i.e. indexing the result of a function call to extract a particular bit of data from the JSON.)

http://www.postgresql.org/docs/9.1/interactive/indexes-expre...
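A minimal sketch of what that could look like. This assumes a hypothetical table "docs" with a text column "data" holding JSON, and a hypothetical IMMUTABLE function json_get(text, text) that extracts a top-level key; neither ships with 9.1, so the extraction function would have to be written (e.g. in PL/V8 or PL/Perl):

```sql
-- Hypothetical: index the result of extracting one key from the JSON text.
-- "docs", "data", and json_get() are assumed names, not built-ins.
CREATE INDEX docs_name_idx ON docs (json_get(data, 'name'));

-- A query using the same expression can then use the index:
SELECT * FROM docs WHERE json_get(data, 'name') = 'baz';
```

The catch is that the planner only matches the index when the query repeats the indexed expression verbatim.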


I was just coming here to say exactly this. You could also use a partial index (that is, an index with a WHERE clause; so, "WHERE json_field LIKE '%foo.bar = baz%'" or whatever).
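Roughly, using the same hypothetical "docs" table and "json_field" column name from the comment above, a partial index sketch might look like:

```sql
-- Hypothetical: index only the rows whose raw JSON text matches a pattern,
-- keeping the index small. Column and table names are assumptions.
CREATE INDEX docs_foo_bar_idx ON docs (id)
  WHERE json_field LIKE '%"foo": "bar"%';
```

Matching on the raw text is fragile (whitespace, key order), but it illustrates the idea.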


You can, but in order to build such an index, the function needs to be marked immutable. Not exactly ideal in this case.


Immutability is perfectly appropriate for that use-case, though. Creating an index via a function that may return different outputs when provided the same input is always going to lose.
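For what it's worth, a pure extraction function really is immutable (same JSON text in, same value out), so declaring it that way is honest. A sketch, with the function body elided since the hypothetical json_get() would need a real JSON parser (e.g. via PL/V8):

```sql
-- Sketch: Postgres requires IMMUTABLE before it will accept a function
-- in an index expression. json_get() is a hypothetical name.
CREATE FUNCTION json_get(doc text, key text) RETURNS text AS $$
  -- ... parse doc and return the value stored under key ...
$$ LANGUAGE plpgsql IMMUTABLE;
```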


But the same is true for MongoDB. If you're querying an attribute of a document that isn't indexed, Mongo has to scan every document in the collection. The solution in both cases is the same: add an index on any attribute you want to query by. This holds true both in Mongo and in this Postgres implementation.



