
[RFC] Add config option to parse the payload into a map with atoms #41

Open
PhillippOhlandt opened this issue Jul 26, 2017 · 4 comments

PhillippOhlandt commented Jul 26, 2017

Hey,

I just connected my job to my existing logic and it errored quite fast because the map keys are not atoms.

The fix was quite simple, but maybe a configurable option in task_bunny would be more user-friendly.

def perform(data) do
  data
  |> Poison.encode!()
  |> Poison.decode!(keys: :atoms)
  |> Indexer.index_by_job()
end
ono commented Jul 27, 2017

I don't want to have implicit atom conversion at the TaskBunny layer (even if it's configurable) because of the potential memory leak: atoms are never garbage collected, so converting arbitrary payload keys can grow the atom table without bound. Also, if users are able to find a configuration option, they can just as easily achieve the same thing by defining a custom base job.

# Caveat: I haven't tried it. Just sketching the idea.
defmodule BaseJob do
  defmacro __using__(_options \\ []) do
    quote do
      use TaskBunny.Job

      def perform(payload) do
        # Convert the string-keyed payload into an atom-keyed map, then delegate.
        atom_map = for {key, val} <- payload, into: %{}, do: {String.to_atom(key), val}
        perform_with_atom(atom_map)
      end
    end
  end
end

defmodule MyJob do
  use BaseJob
  def perform_with_atom(payload) do
    Indexer.index_by_job(payload)
  end
end

However, I also feel that passing and receiving a string-keyed map is kind of tedious. For example, let's imagine we have a job that sends an email to a user.

defmodule EmailJob do
  use TaskBunny.Job

  def perform(%{"user_id" => user_id, "email_id" => email_id}) do
    ...
  end
end

iex> EmailJob.enqueue(%{"user_id" => user.id, "email_id" => email.id})

If we could write it like this, it would be more readable and cleaner.

defmodule EmailJob do
  use TaskBunny.Job

  def perform(user_id, email_id) do
    ...
  end
end

iex> EmailJob.enqueue(user.id, email.id)

Since Elixir doesn't support variadic (n-ary) functions, the implementation won't be straightforward. I haven't had a chance to tackle a solution yet, but maybe we can do something with macros. I also want to avoid over-engineering. However, I'm sharing the idea because I agree that the interface of perform can be improved.
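
For illustration only, here is one minimal, untested sketch of how a macro could get close to that. The NAryJob module and defperform macro are hypothetical names made up for this example (not part of TaskBunny), it only covers the perform side, and enqueue would still take a string-keyed map.

defmodule NAryJob do
  # `defperform send_email(user_id, email_id) do ... end` generates a perform/1
  # that fetches those keys from the string-keyed payload map and calls the
  # multi-argument implementation.
  defmacro defperform({name, _meta, args}, do: block) do
    keys = Enum.map(args, fn {arg, _, _} -> Atom.to_string(arg) end)

    quote do
      # Entry point the worker calls with the decoded payload map.
      def perform(payload) when is_map(payload) do
        apply(__MODULE__, unquote(name), Enum.map(unquote(keys), &Map.fetch!(payload, &1)))
      end

      def unquote(name)(unquote_splicing(args)), do: unquote(block)
    end
  end
end

defmodule EmailJob do
  use TaskBunny.Job
  import NAryJob

  defperform send_email(user_id, email_id) do
    IO.puts("sending email #{email_id} to user #{user_id}")
  end
end

iex> EmailJob.enqueue(%{"user_id" => user.id, "email_id" => email.id})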

@PhillippOhlandt
Author

Valid points.

For the atom-keyed map: maybe it's worth noting that in the README? As you can see, my perform function is just the entry point to my real application logic, and I didn't even think about the atom-key issue until I saw it erroring.

For the perform function improvements: I like your idea, as long as the map version (the current behavior) isn't removed. Maybe it can be solved via pattern matching and some sort of special data structure for the JSON object in RabbitMQ, to identify whether it's just a map or parameters that should be passed to the perform action. Not sure how much complexity and technical debt this would introduce, though.
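
Purely as an untested illustration of that idea (the "__args__" marker key is made up here and isn't anything TaskBunny defines): the enqueued JSON could carry a marker so the worker can tell a plain map apart from a positional argument list, and the job pattern matches on it.

defmodule EmailJob do
  use TaskBunny.Job

  # Plain map payload: the current behavior keeps working.
  def perform(%{"user_id" => user_id, "email_id" => email_id}) do
    send_email(user_id, email_id)
  end

  # Positional payload: a wrapper object marks the values as an argument list.
  def perform(%{"__args__" => [user_id, email_id]}) do
    send_email(user_id, email_id)
  end

  defp send_email(_user_id, _email_id), do: :ok
end

An enqueue helper could then wrap its arguments as %{"__args__" => [...]} before publishing, leaving single-map payloads untouched.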


ono commented Jul 27, 2017

👍 for the documentation improvement. I will add the information to the README and TaskBunny.Job.


ku1ik commented Jun 23, 2018

There's String.to_existing_atom/1, which prevents the memory leak. Maybe that's worth considering (as an opt-in option).
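
A minimal, untested sketch of what that opt-in could look like inside a job, reusing the perform/Indexer example from the top of this thread: String.to_existing_atom/1 only succeeds for atoms that already exist, so unexpected payload keys raise instead of growing the atom table.

def perform(data) do
  data
  # Raises ArgumentError for keys that don't correspond to an existing atom,
  # so arbitrary payloads can't leak new atoms.
  |> Map.new(fn {key, val} -> {String.to_existing_atom(key), val} end)
  |> Indexer.index_by_job()
end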
