v2.0.0 #90
base: master
Conversation
* remove deprecated vector/vectorize parameters
* remove bulkWrite
* remove deleteAll
* removed namespace terminology
* removed db.collections()
* removed client.db(id, region)
* update api report
* some experimental table typing
* dark magic
* added default type for table
* made InferTableSchema more flexible
* moved table.ts into its own folder (mirrors collections)
* more type errors
* broke up table.ts file into its proper structure
* more work on tables
* start work on common command impls class
* fixed couple rebasing errors
* moved all collection functions to generic internal CommandsImpls object
* fixed all of the bugs I introduced in the previous commit :)
* more implementation work
* added table methods for db as well
* added types (but not impl) for alterTable
* add countRows
* add more missing table functions
* added various datatypes (dates & ip addrs)
* update build report
* update build report
* some datatype tweaks (mostly for InetAddress)
* some type operations when translating cql types
* various cursor tweaks & fixes
* minor typing tweaks
* fixed bug with filter being potentially mutable
* made cursors immutable
* some tests (not all yet)
* started work overhauling validation logic
* so much validation
* some folder restructuring
* logging hierarchy
* basic logging impl
* implement warning events
* tests for logging and such
* documented table-schema file
* tsdoc work
* remove checkExists
* ser/des work
* playground script
* started adding custom inspects and work on datetimes
* made _httpClient fully public
* work on DataAPIVector stuff
* some restructuring
* made all Promise<true> methods just return Promise<void>
* removed rackstack hack
* cqlblob type
* additionalHeaders
* started documenting table-related stuff
* formatting for events logging
* documentation for logging
* document collection class
* created CursorError
* set up test suite for table tests
* move dropIndex to db level
* lot of work on bignumbers hack...
* remove CollectionNotFoundError
* added table.definition()
* super basic insertOne test
* basic findOne tests
* TooManyRowsToCount error
* start documenting serdes
* split cumulative errors + some more bignumber serdes work
* changed $PrimaryKeyType to be a string + some test typing fixes
* added sparse data support
* make DataAPIDbAdmin keyspace options no longer extend AdminBlockingOptions
* refactor raw db info into more workable objects
* more unified naming convention of admin interfaces
* bit of name tweaking
* some admin info interface tweaking
* light, temp documentation
* split cursor classes
* little bit of cleanup
* moved cursors test to documents/collections
* unit tests for split cursors
* integration tests for split cursors
* timeouts overhaul
* keyspace impl & tests
* more tests & tweaks
* fix timeout/sort bugs
* refined WithTimeout types
* createCollection custom timeout impl
* more docs and stuff
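As a rough illustration of what a per-call timeout option like the `WithTimeout` types above enables, here's a minimal sketch. `withTimeout` is a hypothetical helper for illustration only, not the library's actual timeout machinery:

```typescript
// Hypothetical helper: race a promise against a deadline, rejecting with a
// descriptive error if the deadline wins. Not the library's real mechanism.
function withTimeout<T>(p: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`operation timed out after ${ms}ms`)),
      ms,
    );
    p.then(
      (v) => { clearTimeout(timer); resolve(v); },
      (e) => { clearTimeout(timer); reject(e); },
    );
  });
}

// Usage: resolves normally if the operation beats the deadline.
withTimeout(Promise.resolve(42), 1000).then((v) => console.log(v)); // 42
```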
* a bunch of tests work
* couple minor test fixes
* more intuitive naming for events/logging stuff
* formatting + timestamps for log messages
* dropIndex ifExists
* sourceModel
* createIndex options restructuring
* split filter & update types
* remove cql from datatypes names
* timeout for cursor.toArray() & coll.distinct()
* added class names for admin event name sources
* remove "spawn" from spawn type names
* updated serdes
* changed token provider a bit
* minor linting fix
* fixed couple import/typing issues + tables readme
* add shorthand datatype functions
* update readme w/ shorthand datatype fns
* marked more internal values
* removed deeppartial & strict filters/sorts/projs
* tiny bit of renaming
* updated serdes
* add shorthand datatype functions
* update readme w/ shorthand datatype fns
* marked more internal values
* start documentation of tables
* added listIndexes
* some alterTable fixes
* bump min node version to v18+
* advanced typings for tables/colls; typings for includeSimilarity
* fixed tables typing test file
* update readme
* switch to codec-based ser/des system
* camel snake case interop
* example and many bug fixes and tweaks and stuff idk lol
* added serdes path matching & class-mapping example
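As a rough sketch of the camel/snake key interop idea above (hypothetical helpers; the actual codec-based system is more involved and handles nesting, datatypes, etc.):

```typescript
// Naive key transformers for illustration; edge cases (acronyms, leading
// underscores, nested objects) are deliberately ignored here.
const camelToSnake = (s: string) => s.replace(/[A-Z]/g, (c) => '_' + c.toLowerCase());
const snakeToCamel = (s: string) => s.replace(/_([a-z])/g, (_m, c: string) => c.toUpperCase());

// Apply a key transformer to every top-level key of an object.
function mapKeys(obj: Record<string, unknown>, fn: (key: string) => string): Record<string, unknown> {
  return Object.fromEntries(Object.entries(obj).map(([k, v]) => [fn(k), v]));
}

const row = { userId: 1, createdAt: '2024-01-01' };
const wire = mapKeys(row, camelToSnake);
console.log(wire); // { user_id: 1, created_at: '2024-01-01' }
console.log(mapKeys(wire, snakeToCamel)); // { userId: 1, createdAt: '2024-01-01' }
```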
* some fixes/work
* some typing work
* added tsdoc everywhere for the most part
* reset example astra-db-ts versions
* few more tests
* minor updates to examples
* update uuid stuff + start datatypes tests
* reexport bignumber
* some datatype tests
* unification of codec types
* snakeCaseInterop => keyTransformer
* some unit tests on ser-des options
Overall looks reasonable, just had a couple of suggestions/thoughts:
The Anyways, you shouldn't need the
Oh yeah, I forgot about this case. I'll add some checks that'll basically say "hey, you inserted a collection doc into a table and it succeeded, but we failed to parse the response since you're using the table class", or, for finds, "hey, you tried to find on a table, but it was actually a collection, so we failed". Thanks for pointing this one out 👍 If you have any other thoughts/suggestions, no matter how nitpicky or small, please do share.
* start work on more docs md pages
* created colls/tables datatype cheatsheet
* did collections dts
* readme work
* ran npm audit fix
* minor tweaks to DATATYPES.md
* bunch of tsdoc for create-table related types
* tsdoc for DataAPILoggingDefaults
* update examples to use @next version of astra-db-ts
* documented list-tables
A couple of other issues I ran into:
Re: DataAPITimestamp point (2), I would recommend automatically serializing JS Dates to DataAPITimestamp, because they're fundamentally the same type: an integer containing milliseconds since the epoch. There's not much reason to make a distinction between the two. Is there even a reason to have a separate DataAPITimestamp type? Right now, sending a JS Date to

Some more notes:
Regarding #1 & #2, the main reason for the separate

I'm leaning towards throwing a readable error (agree that it's currently not readable) that says "either use a

I'm on the fence about serializing dates but still reading them back as a
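On the Date-vs-timestamp equivalence point raised above, here's a minimal sketch of why auto-serialization would be lossless. `DataAPITimestampLike` is a hypothetical stand-in for illustration, not the library's actual `DataAPITimestamp` class:

```typescript
// Hypothetical stand-in for a millis-since-epoch timestamp type, used only
// to illustrate that it and a JS Date carry exactly the same information.
class DataAPITimestampLike {
  constructor(public readonly millis: number) {}

  toDate(): Date {
    return new Date(this.millis);
  }

  static fromDate(d: Date): DataAPITimestampLike {
    return new DataAPITimestampLike(d.getTime());
  }
}

const d = new Date('2024-01-02T03:04:05.678Z');
const ts = DataAPITimestampLike.fromDate(d);

// Round-trips losslessly, since both are just milliseconds since the epoch:
console.log(ts.toDate().toISOString() === d.toISOString()); // true
```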
Oops, bug, will fix in the next preview release. If you really want a quick fix for implementation/testing purposes, you can use:

```typescript
type TableIndexDescriptor = Awaited<ReturnType<InstanceType<typeof Table>['listIndexes']>> extends (infer T extends object)[] ? T : never;
```
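For what it's worth, the `Awaited<ReturnType<...>>` extraction trick in that workaround generalizes; here's a self-contained toy version, where `Repo` is a hypothetical class standing in for `Table`:

```typescript
// Toy class standing in for Table, purely to demonstrate the type-level
// extraction of an element type from an async method's return type.
class Repo {
  async listIndexes(): Promise<{ name: string; column: string }[]> {
    return [{ name: 'name_idx', column: 'name' }];
  }
}

// Same shape as the workaround above: unwrap the Promise, then infer the
// array's element type (requires TS 4.7+ for `infer T extends object`).
type IndexDescriptor =
  Awaited<ReturnType<InstanceType<typeof Repo>['listIndexes']>> extends (infer T extends object)[]
    ? T
    : never;

// IndexDescriptor is now { name: string; column: string }:
const desc: IndexDescriptor = { name: 'name_idx', column: 'name' };
console.log(desc.column); // name
```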
This came from the Data API team themselves; on the Data API as well,

The idea is to make it clearer that index names exist within the keyspace, and not the table, so it reduces the chance of someone dropping a keyspace that they didn't mean to. This one's out of my control 🤷
No, there is complete
Actually, when someone uses

However, this library is written purely in JS, unlike the native JSON module, which is typically implemented in highly-optimized C++, and is therefore decently slower. Plus, it sets a null prototype for each JS object, so I need to recursively fix the proto for each parsed object and its nested objects as well.

TL;DR: it's a bit of a mess, because the native JSON module doesn't support big numbers, and the Data API doesn't want to allow big numbers to be read as, or even written as, strings; instead, they're insistent on having bignumbers be raw JSON number literals, since that's technically allowed in the spec.

I can expand if necessary, but I can't really enable bignumbers by default for collections (I technically could for serialization, but not for deserialization; however, I'd rather have it explicitly on/off instead of having some half/half behavior). I'll try to throw a readable error here as well for bignums on collections that don't have bignums enabled.
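To make the big-number problem above concrete, here's a quick demo of the native JSON module's two failure modes: silent precision loss on parse, and an outright `TypeError` when stringifying a BigInt:

```typescript
// A large varint well beyond Number.MAX_SAFE_INTEGER.
const big = '1231231222132132131231231231231231231231232132133';

// 1) JSON.parse silently coerces large integer literals to a lossy double:
const parsed = JSON.parse(`{"varint": ${big}}`);
console.log(String(parsed.varint) === big); // false — precision was lost

// 2) JSON.stringify refuses BigInt values outright:
let threw = false;
try {
  JSON.stringify({ varint: BigInt(big) });
} catch {
  threw = true; // TypeError: Do not know how to serialize a BigInt
}
console.log(threw); // true
```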
Re: BigInts, I looked a little further into it, and here's where my misunderstanding is:
oh you're right... I forgot about that case... I suppose
No, there very much is. I can't say that the code for it is the cleanest, since it's a pretty messy situation in the first place that had to be retrofitted (I didn't expect the Data API to go this route), but it exists. This is in:

```typescript
const serialized = (info.bigNumsPresent)
  ? this.bigNumHack?.parser.stringify(info.command) // optional since it's not present in DataAPIDbAdmin usages. Not super happy with the loose invariant, but it is what it is
  : JSON.stringify(info.command);
```

Example:

```typescript
> await table.insertOne({ text: '1', int: 1, varint: 1231231222132132131231231231231231231231232132133n })
{ insertedId: { text: '1', int: 1 } }
> await tfa
[
  {
    varint: 1231231222132132131231231231231231231231232132133n,
    int: 1,
    text: '1'
  }
]
```
Also, for what it's worth, I also strongly made the same argument for accepting and returning

though to be fair, I guess the issue still would've existed for
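For reference, the string-based approach (which, per the discussion above, the Data API rejected in favor of raw number literals) is straightforward to sketch. The replacer here is a hypothetical helper, not part of astra-db-ts:

```typescript
// Hypothetical replacer: encode BigInts as JSON strings on the wire.
const replacer = (_key: string, value: unknown) =>
  typeof value === 'bigint' ? value.toString() : value;

const doc = { varint: 1231231222132132131231231231231231231231232132133n };
const json = JSON.stringify(doc, replacer);
console.log(json); // {"varint":"1231231222132132131231231231231231231231232132133"}

// Round-trips losslessly, at the cost of the wire type being a string:
const back = BigInt(JSON.parse(json).varint);
console.log(back === doc.varint); // true
```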
Actually, a known bug you may be running into: if you're using BigNumbers in tables, but using your own version of the library instead of the one reexported by astra-db-ts, the code won't pick up that it's a BigNumber. I have a fix for this in a local branch already.
Force-pushed from 25a764d to 8ee74e2.
One more point I noticed:
As far as I'm aware, the implementation still doesn't work if the vectorize provider fails; still waiting on the Data API to fix that before it can be safely exposed for general use.