Add Soil data macrosys #1547
Comments
Here is the link to the Zenodo archive for all derived datasets of global soil properties (0.065 km² spatial resolution)
Are these datasets added to
@Aakash3101 feel free to work on the issue. I recommend that you start from the bottom up
Sure @henrykironde
@henrykironde I wanted to clarify a doubt: in the last dataset, "Soil available water capacity in mm derived for 5 standard layers", I can make a single script for all the files in the dataset, right? The dataset has 7 files, so when I run
Also, shall I make separate commits for each dataset, or one combined commit?
Yes, all the files go in the same directory. In this case, I think a fitting name for the directory would be
@henrykironde I think this PR can be completed during my GSoC project, if I get selected, because these files are very big indeed 😂, and it might take me time to check each one and then make a PR for the dataset I added.
Each checkbox is a single PR. I am actually working on them, so don't worry about the whole issue. Your goal should be to understand, or get a good overview of, the moving parts of the project.
Yes, I am actually enjoying this kind of work, as I am learning new things.
@henrykironde I am not able to load the
I will check this out.
Well, I am also figuring something out, and it turns out that the tile size can impact the processing time. In the code for the
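For context, the tile-size knob being discussed here is most likely the `-t` option of PostGIS's `raster2pgsql` loader. This is a hedged sketch, not the script from this repo; the file name, table name, and database name are hypothetical:

```shell
# Hypothetical example: load a large GeoTIFF into PostGIS in 256x256 tiles.
# Smaller tiles mean more table rows but cheaper per-tile reads; larger
# tiles mean fewer rows but more memory per read, which can change
# processing time noticeably.
#   -s 4326     set the SRID
#   -t 256x256  tile size (width x height in pixels)
#   -I          build a GiST spatial index on the raster column
#   -C          apply standard raster constraints
raster2pgsql -s 4326 -t 256x256 -I -C sol_watercontent.tif public.soil_water \
  | psql -d soils_db
```

Benchmarking a few tile sizes (e.g. 128x128 vs 256x256 vs 512x512) on one file would show which trade-off wins on a given machine.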
Any updates @henrykironde? To me, it seems that when a tile size of
When I run the

Another way to deal with this processing-time issue is to reference the file in the database using the

But this would defeat the reason we are storing it in the database in the first place, because if the file is moved from the location it should be in, the reference would not work. I had the idea for the
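The file-reference idea described above matches `raster2pgsql`'s out-of-db mode (`-R`). A hedged sketch, with hypothetical paths and names:

```shell
# Hypothetical example: register the raster out-of-db with -R.
# Only metadata and the file path are stored in the table, so loading is
# fast and the table stays small, but, as noted above, queries break if
# the underlying file is moved or renamed. -R expects an absolute path.
raster2pgsql -s 4326 -t 256x256 -I -C -R /data/rasters/sol_watercontent.tif \
  public.soil_water_ref | psql -d soils_db
```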
@Aakash3101 what are your computational resources? |
CPU : i7 8th Gen |
Could you try to close other applications (especially IDEs), open QGIS, and try to load the map? I will try it later from my end.
I can load and view the map from the raw data file, but not from the PostGIS database. |
Yes, load the data from the PostGIS database and give it at least 10 minutes, depending on your resources. Make sure to free at least 4 GB of memory. Most IDEs take about 2 GB; closing them will enable QGIS to load the data.
Okay, I will let you know if it opens. |
So this time, while loading the file in QGIS, I monitored my RAM usage through the terminal, and it uses all my memory, and then the application is terminated. I don't know the reason, but I will soon find out.
And when I open the raw data file, it uses just around 2GB of my RAM. |
When I query the table in pgAdmin 4 to show all the values in the table, Postgres uses all the RAM and then freezes, so I think I need to optimize the memory available for queries. Please let me know if you find something useful for optimizing memory usage.
Okay, I think at this point you should let me handle this. It could take at least a day or two; I will try to find a way around it.
Sure @henrykironde |
Soil water content (volumetric %) for 33kPa and 1500kPa suctions predicted at 6 standard depths (0, 10, 30, 60, 100 and 200 cm) at 250 m resolution
source https://zenodo.org/record/2784001#.YDlJ02pKiBR
or
https://developers.google.com/earth-engine/datasets/catalog/OpenLandMap_SOL_SOL_WATERCONTENT-33KPA_USDA-4B1C_M_v01
citation: "Tomislav Hengl, & Surya Gupta. (2019). Soil water content (volumetric %) for 33kPa and 1500kPa suctions predicted at 6 standard depths (0, 10, 30, 60, 100 and 200 cm) at 250 m resolution (Version v0.1) [Data set]. Zenodo. http://doi.org/10.5281/zenodo.2784001"
License (for files):
Creative Commons Attribution Share Alike 4.0 International