Releases: ctdk/goiardi
v0.11.10 - Tomato Catchup
This announcement actually includes the info from both the 0.11.10 release and the 0.11.9 release as well. For a small variety of reasons, including the fact that these past two releases were both pretty small and that I've been cranking on the Chef Server 12 support lately, I never got around to drafting formal releases and binaries. At last, here they are. There's a fix to remove a dependency that was being removed from Debian, and two small changes to support newer versions of the chef tools.
From the CHANGELOG:
0.11.10
-------
* It was brought to my attention by @joerg that 'create_key' is now a valid JSON
hash key for when creating clients with newer chef tools. Added it to the
whitelist of valid elements for creating clients. Thanks again for bringing
it to my attention.
0.11.9
------
* Follow chef RFC041. Newer chef-clients and knifes (knives?) were breaking
because goiardi wasn't following this RFC and reporting the supported API
versions. (Thanks julian7 for the PR and for bringing it to my attention.)
* Remove dependency on golang.org/x/exp/utf8string - it's failing some tests on
the Debian build servers, and goiardi's liable to get kicked out of testing
shortly if it isn't addressed. Since goiardi wasn't actually using utf8string
for anything really complicated, it was easy enough to tear that out and whip
up a replacement with built-in functions. (Thanks jordi for bringing this
situation to my attention as well.)
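For anyone curious what that kind of substitution looks like, here's a rough sketch (not goiardi's actual code; the function names are made up for illustration) of doing rune-aware string work with nothing but the standard library's unicode/utf8 package and a []rune conversion, which is the sort of thing utf8string was providing:

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// firstRune returns the first rune of s, roughly what utf8string's At(0)
// gave us, using only the standard library.
func firstRune(s string) rune {
	r, _ := utf8.DecodeRuneInString(s)
	return r
}

// runeSlice returns the substring of s from rune index i to j, standing in
// for utf8string's Slice. Converting to []rune is less clever than the
// x/exp package, but it's plenty for short strings.
func runeSlice(s string, i, j int) string {
	rs := []rune(s)
	return string(rs[i:j])
}

func main() {
	s := "göiardi"
	fmt.Printf("%c %s %d\n", firstRune(s), runeSlice(s, 1, 4), utf8.RuneCountInString(s))
}
```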
v0.11.8 - Coelacanth
While there are other sticks in the fire, both goiardi related and otherwise, here's another release with tweaks and improvements to brighten the days of anyone who happens to use goiardi.
From the CHANGELOG:
* Made some small tweaks and updates to the depsolver to make that a little
better. NB: Down the road, there may be some further changes to the depsolver,
especially where 'most constrained' cookbooks are concerned.
* Update the circleci config to use the version 2.0 syntax.
* Add purging old sandboxes that have been hanging around for too long.
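To give a rough idea of what purging stale sandboxes involves (this is only a sketch with made-up names, not the code that shipped), it boils down to dropping anything older than some cutoff:

```go
package main

import (
	"fmt"
	"time"
)

// sandbox stands in for goiardi's real sandbox type; only the fields needed
// for the example are included.
type sandbox struct {
	ID        string
	CreatedAt time.Time
}

// purgeStale returns the sandboxes younger than maxAge and reports how many
// stale ones were dropped.
func purgeStale(sandboxes []sandbox, maxAge time.Duration) ([]sandbox, int) {
	cutoff := time.Now().Add(-maxAge)
	kept := make([]sandbox, 0, len(sandboxes))
	purged := 0
	for _, sb := range sandboxes {
		if sb.CreatedAt.Before(cutoff) {
			purged++
			continue
		}
		kept = append(kept, sb)
	}
	return kept, purged
}

func main() {
	boxes := []sandbox{
		{ID: "old", CreatedAt: time.Now().Add(-48 * time.Hour)},
		{ID: "new", CreatedAt: time.Now()},
	}
	remaining, purged := purgeStale(boxes, 24*time.Hour)
	fmt.Println(len(remaining), "kept,", purged, "purged")
}
```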
v0.11.7 - Amfortas
Yet another bugfix release that was brewing for an awfully long time. The fixes in this release improve memory usage, shrink some database tables by allowing more control over how long to keep reports and node statuses and how much information to store in the event log tables, handle some of the more unusual search types I didn't even know anyone used with Chef, and more.
From the CHANGELOG:
* Allow access to /debug/pprof with a whitelist of IP addresses (a sketch of
one way to do this sort of thing follows the list).
* Properly index arrays of hashes, arrays of arrays, etc. in object attributes.
* Pretty serious memory usage improvements with search (both the in-memory
and postgres searches).
* Fix reconnecting to serf if the connection is somehow interrupted.
* Fix negated range queries (it turns out they *do* have a use after all), and
refactor how NOT queries are handled generally.
* Add options to purge old reports and node statuses.
* Add option to skip logging extended object information in the event log.
* A handful of other bugfixes.
* Bump up to using golang 1.9.3 for builds.
* Minor changes to the documentation.
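As promised above, here's the general shape of whitelisting access to /debug/pprof by IP address. This isn't goiardi's actual middleware, just an illustration using only the standard library:

```go
package main

import (
	"net"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof handlers on http.DefaultServeMux
)

// ipWhitelist wraps a handler and rejects requests whose source address is
// not in the allowed set.
func ipWhitelist(allowed map[string]bool, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		host, _, err := net.SplitHostPort(r.RemoteAddr)
		if err != nil || !allowed[host] {
			http.Error(w, "forbidden", http.StatusForbidden)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	allowed := map[string]bool{"127.0.0.1": true, "::1": true}
	// Wrap the default mux (where net/http/pprof registered itself) so the
	// whitelist check runs before the pprof endpoints are reached.
	http.ListenAndServe(":4545", ipWhitelist(allowed, http.DefaultServeMux))
}
```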
v0.11.5 - Spookay!
There's just some search fixes in this release. Important, but not especially interesting.
From the CHANGELOG:
* Several search fixes:
- With postgres search:
* Fixed reindexing after it broke with the previous update that eliminated
a lot of unneeded extra rows in the database.
* Fixed basic queries with NOT statements.
* Separately, fixed using NOT with subqueries. On a somewhat complicated
note, but in a way that appears to match standard Solr behavior, when
doing a query like "name:chef* AND NOT (admin:true OR admin:bleh)" it
works as is, but when a negated subquery is followed by another basic
query statement, it needs to have extra parentheses around the NOT +
subquery, like "name:chef* AND (NOT (admin:true OR admin:bleh)) AND
public_key:*". A convoluted and unlikely scenario, but it could happen.
- With in-memory search:
* NOT + subqueries was also broken with the in-mem search. The fixes for
the pg-search partially fixed it for in-mem in that it no longer made the
server panic, but it was returning incorrect results. Additional work
ended up being needed for in-mem search.
v0.11.4 - Fahrenheit Forever
Mostly internal changes here that don't radically affect normal usage. The biggest change from an external perspective is that goiardi now supports Chef auth v1.3, and many endpoints will respond to HEAD requests with the same HTTP status code as they would respond to a GET request (even if it might not be a particularly sensible response). Otherwise these changes are mostly done with the future in mind.
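To illustrate the HEAD behavior (a hedged sketch with made-up handler and helper names, not goiardi's actual endpoint code): a HEAD request goes through the same existence checks a GET would, and returns the same status code, just without a body.

```go
package main

import "net/http"

// resourceExists stands in for whatever lookup the real endpoint performs.
func resourceExists(name string) bool {
	return name == "example-node"
}

func nodeHandler(w http.ResponseWriter, r *http.Request) {
	name := r.URL.Path[len("/nodes/"):]
	switch r.Method {
	case http.MethodHead:
		// Same status code a GET would produce, but no body is written.
		if !resourceExists(name) {
			w.WriteHeader(http.StatusNotFound)
			return
		}
		w.WriteHeader(http.StatusOK)
	case http.MethodGet:
		if !resourceExists(name) {
			http.Error(w, `{"error":"not found"}`, http.StatusNotFound)
			return
		}
		w.Header().Set("Content-Type", "application/json")
		w.Write([]byte(`{"name":"` + name + `"}`))
	default:
		w.WriteHeader(http.StatusMethodNotAllowed)
	}
}

func main() {
	http.HandleFunc("/nodes/", nodeHandler)
	http.ListenAndServe(":4545", nil)
}
```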
From the CHANGELOG:
* Move the custom goiardi error type out of util and into its own module.
Wrappers around the new module are in util still for convenience, and
because the functions and interface are used all over the place.
* Many endpoints now handle HEAD requests where appropriate. With some
endpoints this is not especially useful, but with others it's a lightweight
way to see what resources exist and so forth. Implements Chef RFC 090.
* Start using contexts with requests. This does mean that goiardi will require
at least go 1.7. (As of 0.11.3 goiardi only supported go 1.7+, but it was
likely to build with somewhat older versions anyway.)
* Minor bugfixes - dealt with a possible race condition with the in-mem search
index, changed some logging statements from Info to Debug that didn't need to
be at Info level, removed a test log statement that was no longer necessary,
and updated copyright dates.
* Add the Chef API version header to responses.
* Change behavior if the data file and use-(mysql|postgresql) are specified
together; formerly it was a fatal error, but now it'll just emit a warning
in the error log and ignore the data file setting.
v0.11.3 - Menaces with Spikes
Mostly housekeeping this time, but it's good housekeeping. There's one fairly important bug fix where, if you had arrays with duplicate items in objects being indexed, goiardi would crash trying to index them. Surprisingly, this didn't come up very often. It was also remarkably difficult to reproduce the crash, with lots of false starts and dead-end ideas along the way.
Otherwise, the biggest things are new command line flags so you can configure MySQL or Postgres without using the config file, being able to configure almost all command line flags via environment variables now (which should make running goiardi in docker a bit easier), and optionally trimming the length of values in objects stored in the index. Other than that, there was some documentation cleanup and tweaks to packaging.
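The environment variable support boils down to the familiar pattern of letting an environment variable supply a flag's default. The sketch below is illustrative only (the GOIARDI_HOSTNAME name here is an assumption; see the documentation for the real variable names), not goiardi's actual option handling:

```go
package main

import (
	"flag"
	"fmt"
	"os"
)

// envOr returns the value of the named environment variable, or def if it
// is unset, so a flag's default can come from the environment.
func envOr(key, def string) string {
	if v, ok := os.LookupEnv(key); ok {
		return v
	}
	return def
}

func main() {
	// Hypothetical flag/env pair for illustration purposes.
	hostname := flag.String("hostname", envOr("GOIARDI_HOSTNAME", "localhost"),
		"hostname to report (env: GOIARDI_HOSTNAME)")
	flag.Parse()
	fmt.Println("using hostname:", *hostname)
}
```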
From the CHANGELOG:
* Add an option to trim values in search indexes. Currently not enabled by
default, but will be in the next minor goiardi release (so, either 0.12.0 or
1.0.0, depending on which ends up being next). Existing indexes ought to be
reindexed upon upgrading, but they should still work if this is skipped.
* Fix a bug where duplicated items in slices in objects being indexed with the
in-memory trie based index would cause goiardi to crash. For good measure,
even though it isn't necessary to prevent a crash, remove those same
duplicate items from objects being indexed with the postgres index.
* Mark --use-unsafe-mem-store as deprecated. In the unlikely event someone's
using that option, a warning will print in the log. This option may be
removed at any time.
* Allow setting configuration options via environment variables. (See
the documentation for the details.)
* Finally allow configuring MySQL or PostgreSQL connection options with
command line flags (or, now, environment variables).
* Fixed format issues and wording in a few places in the documentation, along
with updating the docs for the current version.
* Add a hidden flag to generate a simple man page.
* Add that simple man page, along with the html docs, to the packagecloud.io
packages.
* Add a Dockerfile to allow running the local goiardi source in docker.
* Add Debian "stretch" and Ubuntu "yakkety yak" to the distro versions we have
in the package repository.
v0.11.2 - Wieland der Schmied
A small release, with only two changes:
- Fix a bug with escaped characters in certain searches (thanks @ickymettle). Does require rebuilding the search index.
- Allow using 'novault' as a build tag to avoid having to have the vault api present when building goiardi. Not relevant to most people. This greatly simplifies the Debian package, however.
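For anyone who hasn't used build tags before, here's roughly how a tag like 'novault' is typically wired up in Go source. This is a generic sketch with made-up names rather than goiardi's actual file layout: one file carries the real vault-backed code behind a "// +build !novault" constraint, while a stub like the one below is compiled only when building with -tags novault.

```go
// +build novault

package secret

import "errors"

// getSecret is the stub compiled in when building with "go build -tags
// novault", so the vault client library never needs to be present; the real
// implementation lives in a sibling file constrained with !novault.
func getSecret(path string) (string, error) {
	return "", errors.New("built with the novault tag; vault support is not compiled in")
}
```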
That's about it.
v0.11.1 - Dimetrodon
Not a big release in and of itself, but these changes seemed worth getting released sooner rather than later to make it in before the stretch freeze if possible.
The most important change, generally, is the reworking of reindexing - now, reindexing jobs won't pile up on top of each other, and the jobs are split into smaller, hopefully more manageable chunks. Storing secrets in vault is pretty neat too, though.
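The gist of the reindexing change, sketched below with invented names (not the actual goiardi code): a single slot acts as a try-lock so a second reindex request is refused instead of piling up behind the first, and the IDs to index are walked in fixed-size chunks.

```go
package main

import (
	"errors"
	"fmt"
)

// reindexSlot is a one-element buffered channel used as a try-lock.
var reindexSlot = make(chan struct{}, 1)

// reindex indexes ids in chunks of chunkSize; if another reindex is already
// running it returns immediately instead of stacking up behind it.
func reindex(ids []string, chunkSize int, indexChunk func([]string) error) error {
	select {
	case reindexSlot <- struct{}{}:
		defer func() { <-reindexSlot }()
	default:
		return errors.New("a reindexing job is already running")
	}
	for start := 0; start < len(ids); start += chunkSize {
		end := start + chunkSize
		if end > len(ids) {
			end = len(ids)
		}
		if err := indexChunk(ids[start:end]); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	ids := []string{"node1", "node2", "node3", "node4", "node5"}
	err := reindex(ids, 2, func(chunk []string) error {
		fmt.Println("indexing chunk:", chunk)
		return nil
	})
	fmt.Println("done, err =", err)
}
```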
The CHANGELOG, this time:
- Allow storing secrets (client & user public keys, shovey signing private
keys, and user password hashes) in an external service. Currently only vault
is supported.
- Rework reindexing to break it into smaller chunks and ensure that only one
reindexing job can run at a time.
- Package goiardi for RHEL 7 and Debian jessie for s390x. Rather experimental,
of course.
v0.11.0 - Steel Bacon
It's been a while, but here's a new goiardi release with (IMHO) some pretty good stuff.
From the CHANGELOG:
- Ability to upload cookbooks to S3.
- Add script to upload local files to S3 to migrate.
- Change how items are indexed with the postgres indexer, to reduce the number
of rows in the search_items table substantially (at the cost of possible
differences in search results in a few weird corner cases).
- Search parser no longer chokes on Unicode. Unfortunately Postgres' ltree
module still does not accept all Unicode alphanumeric characters as valid.
- Use vendoring.
- Rejigger the package building process a bit - changing how the different
packages are built and how version numbers are determined.
- Fix a long-standing annoyance where the log file would get truncated when
goiardi started or restarted.
- Allow passing environment variables to goiardi through the config file.
- Fix in-memory indexer to work with go 1.7.
- Add packages for CentOS 6 and 7. Also use a gox fork pulling in someone's PR
with better ARM support until that gets merged upstream eventually.
- Change the postgres columns using the 'json' data type to use 'jsonb'
instead. This is generally better, but does mean that goiardi now requires
PostgreSQL 9.4 or later.
As noted, goiardi now needs at least PostgreSQL 9.4 to be able to run. (In-memory and MySQL users are of course unaffected.) The search index will need to be rebuilt with 'knife index rebuild -y'. The new indexing scheme uses roughly half as many rows in the search_items table as before, though.
Also, thanks to golang 1.7, this release includes an experimental goiardi binary for Linux on z/Architecture. It's totally untested so far, but if all goes well maybe there'll even be a real package for it.
Convenient debs and rpms of goiardi can be found at https://packagecloud.io/ct/goiardi, as always.
Now development on 1.0.0 should actually get going for real, once, erm, it gets synced up again with master.
v0.10.4 - Numbers Station
Not a huge release or anything, but it adds some interesting stuff (in my opinion).
From the CHANGELOG:
- Export pprof info over HTTP, but only accept connections from localhost for
that information.
- Add statsd metrics for things like chef-client run timings (requires
reporting) and started/succeeded/failed, number of nodes, API endpoint
timings, various pieces of runtime info like GC pauses, RAM used, and number
of resources updated & total resources for client runs.
- Fix JSON decoding issue where very large numbers would suddenly turn into
floats.
The JSON number bug required a pretty specific set of circumstances to cause trouble; you needed a very large number (not a string), and it needed to be used somewhere where silently turning into a float would break things. Still, it can definitely be a problem, so you should probably upgrade.
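For the curious, the underlying Go behavior looks like this (a standalone illustration, not goiardi's decoding code): unmarshalling JSON into interface{} turns every number into a float64, which silently rounds integers above 2^53, while a json.Decoder with UseNumber keeps the original digits intact.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	payload := `{"big": 9007199254740993}` // 2^53 + 1, not representable as a float64

	// Plain Unmarshal into interface{} stores the number as a float64.
	var lossy map[string]interface{}
	json.Unmarshal([]byte(payload), &lossy)
	fmt.Printf("plain Unmarshal: %.0f\n", lossy["big"].(float64)) // rounded to 9007199254740992

	// A Decoder with UseNumber keeps it as a json.Number (a string).
	var exact map[string]interface{}
	dec := json.NewDecoder(strings.NewReader(payload))
	dec.UseNumber()
	dec.Decode(&exact)
	n := exact["big"].(json.Number)
	fmt.Println("with UseNumber:", n.String()) // 9007199254740993, still intact
}
```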
The other items are more interesting. Goiardi now exports pprof info via /debug/pprof, but for safety's sake only to localhost. Even better, though, is how goiardi can send statistics about itself to statsd, which in turn can send them on to something like graphite so you can make nice visualizations of how goiardi is behaving. See the relevant docs for more information.
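Since statsd is just a plain-text protocol over UDP, sending these metrics doesn't take much machinery. The snippet below is a hedged sketch with made-up metric names, not the metrics code goiardi actually uses:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// sendMetric writes a single statsd datagram like "chef.run.success:1|c".
func sendMetric(conn net.Conn, name, value, metricType string) error {
	_, err := fmt.Fprintf(conn, "%s:%s|%s", name, value, metricType)
	return err
}

func main() {
	conn, err := net.Dial("udp", "127.0.0.1:8125")
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	// A counter for a finished client run and a timer for how long it took.
	sendMetric(conn, "chef.run.success", "1", "c")
	runTime := 42 * time.Second
	sendMetric(conn, "chef.run.time", fmt.Sprintf("%d", runTime.Milliseconds()), "ms")
}
```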