What version of the runtime environment does a .NET 5 application actually need?
Our team deals with a legacy application that relies on a rather old deployment process and infrastructure:
- we deploy the application to an environment, called Clone, that is very similar to production. Clone is a Windows Server machine that has (among other versions) a .NET 5.0.x runtime installed (e.g. .NET 5.0.16)
- after the package is reviewed, someone deploys it to production
- the application failed to start on production, with an Event Viewer error saying that it requires .NET 5.0.13. This came as a big surprise, as we do not use anything beyond .NET 5.0.6
- this was fixed by installing .NET 5.0.16 on production as well
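For context, the minimum runtime version that the host checks at startup is recorded in the `*.runtimeconfig.json` file next to the published binaries, which is presumably where the 5.0.13 requirement came from. A minimal sketch of that file (the version value is illustrative, not taken from our actual deployment):

```json
{
  "runtimeOptions": {
    "tfm": "net5.0",
    "framework": {
      "name": "Microsoft.NETCore.App",
      "version": "5.0.13"
    }
  }
}
```

If every runtime installed on the machine is below the listed version, the host refuses to start the application with exactly the kind of "requires .NET 5.0.x" error described above (the host can roll forward to a higher installed version, but never backward).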
Now, it is not clear what actually happened. My assumption is the following:
- NuGet package restore on Clone chose a version closer to what was available on the server (.NET 5.0.13), despite the application not requiring more than .NET 5.0.6
- since the package was built with .NET 5.0.13 as a dependency, it failed to start on a server that has a lower patch version
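If it helps to test that assumption: my understanding is that the patch version written into `runtimeconfig.json` normally comes from the SDK used at build/publish time rather than from NuGet restore, and it can be pinned explicitly in the project file. A sketch, assuming an SDK-style project (the property values are illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
    <!-- pin the exact runtime patch the app asks for at startup -->
    <RuntimeFrameworkVersion>5.0.6</RuntimeFrameworkVersion>
    <!-- roll-forward policy the host uses when picking an installed runtime
         (e.g. Minor, LatestPatch); it never rolls backward to a lower patch -->
    <RollForward>LatestPatch</RollForward>
  </PropertyGroup>
</Project>
```

Pinning `RuntimeFrameworkVersion` on Clone and rebuilding would show whether the 5.0.13 requirement still appears in the published package.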
However, I am unable to find any documentation that explains whether my assumption is correct.
How does NuGet (and possibly MSBuild, though I am not sure whether it is involved) handle the runtime versions in this case?