`conda env create -f env.yml` cannot complete in AWS SageMaker JupyterLab when using an ml.t3.medium instance (which has 4 GB of RAM).
Not all of that 4 GB is available; typically only 2 to 3 GB are free.
Here's a graph of conda memory usage (on my laptop)
The red line is `conda env create` run on an exported env file, while the blue line is the default env file as it would normally be written. Even the better (blue) run comes close to the limit of available RAM on the small instance.
There is also conda-lock. I haven't tried it, but it's possible it would be more frugal with RAM. However, the big memory spike happens during the verifying-transaction stage rather than during the environment solve, so I'm not sure it would really make a difference.
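For anyone who wants to try the conda-lock route, a sketch of the workflow (the env name `myenv` and the platform are assumptions, not from this project): solve once on a machine with plenty of RAM, commit the lock file, then install from it on the small instance without invoking the solver at all.

```shell
# On a machine with more RAM: solve env.yml and write a lock file.
conda-lock lock -f env.yml -p linux-64

# On the ml.t3.medium instance: install from the pre-solved lock file,
# which skips the solve stage entirely.
conda-lock install -n myenv conda-lock.yml
```

Whether this helps depends on where the spike actually is; if it's in the verify/extract stage, as the graph suggests, the lock file mainly buys runtime, not peak memory.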
There's a chance we could get the memory usage down even more, but I'm going to need to table this and move on for now.
Actually, I couldn't resist. Here are some more tests, this time on AWS.
The best runtime and memory come from pinning versions in the env file, setting the channel priority to strict, and using only conda-forge and defaults. It's nice for the runtime, but doesn't help the memory issue much.
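For reference, this is the shape of env file I mean (the package names and versions here are illustrative, not the actual project environment):

```yaml
name: myenv
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.10
  - numpy=1.24
  - pandas=2.0
```

Strict channel priority can be set once per machine with `conda config --set channel_priority strict`, which narrows the solver's search space and is where most of the runtime win comes from.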