Future of framework dependencies #80
Replies: 36 comments
-
As a business, you never know when a client is going to come through the door and say "hey, that cool project you have on Windows? It would be nice if you could run it on Android, or on Windows XP" (which is something we've actually been asked). Performance is important, but what's the point of a project if it cannot be used? So that answers the question: compatibility comes before performance. As you say, the ideal solution would be multiple targets, but that's a lot of maintenance for a single developer; it's happening to me in my projects too. BepuPhysics already seems very efficient and performant, so I would stick with .NET Standard 2.0. I don't think it's worth losing several platforms for the sake of gaining 15% performance.
-
I'm not using bepu 2 (and unfortunately maybe never will, as my current project may be my last solo project and I decided to stick with bepu 1), but supposing I did, my requirements would be working nicely on PS4 and Switch and being able to work side by side with MonoGame (it does now). I don't know exactly how this translates into answers to your questions, but I suppose I'm also on the compatibility bandwagon.
-
.NET Core 3 all the way.
-
I'd vote for .NET Standard 2.1 atm, but since it's something for the long run, then yeah, .NET Core 3 imho.
-
It seems the reason bepuphysics2 exists and is so different from v1 is about squeezing the best performance out of .NET, and .NET Standard 2.1 does not bring too much in that area. Or does it? So what about branching the current state as .NET Standard 2.0 and only applying critical bug fixes there, then driving master toward .NET Core 3? Would a compile-time compatibility shim for intrinsics be complicated, for the targets where intrinsics are not supported? I don't have older projects to maintain, so I am obviously biased... :-)
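As a hypothetical sketch of what such a shim could look like (assuming a multitarget over netstandard2.0 and netcoreapp3.0; the #if symbols are the SDK defaults, and none of these names come from bepuphysics2 itself):

```csharp
using System.Numerics;
#if NETCOREAPP3_0
using System.Runtime.Intrinsics;
using System.Runtime.Intrinsics.X86;
#endif

public static class IntrinsicsShim
{
    // Fused multiply-add with a portable fallback where FMA isn't available.
    public static Vector4 MultiplyAdd(Vector4 a, Vector4 b, Vector4 c)
    {
#if NETCOREAPP3_0
        if (Fma.IsSupported)
        {
            // Vector4 and Vector128<float> are both 16 bytes, so the
            // AsVector128/AsVector4 reinterpretations are free.
            return Fma.MultiplyAdd(a.AsVector128(), b.AsVector128(), c.AsVector128()).AsVector4();
        }
#endif
        // On netstandard2.0, or on hardware without FMA, this still vectorizes
        // through Vector4's JIT support, just without the fused operation.
        return a * b + c;
    }
}
```

The shim itself isn't complicated; the maintenance cost is mostly in keeping both paths correct and tested.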
-
Branching a stable version doesn't sound like a good idea at this point imho.
-
Right. Another option then is to branch the current state to become bepuphysics v2.1 or v3; that achieves the same effect, and the naming does not matter much. It only depends on what path @RossNordby wants to follow.
-
I should mention that the engine is pretty much guaranteed to, at some point, adopt platform dependent intrinsics for my own purposes. It's just a question of when and what I do about legacy at that point. For example, if I wait long enough that .NET 5 becomes the de facto standard and mostly unifies mono and coreclr, there would likely be little point in trying to maintain .NET Framework compatibility. Unity might take a while longer, but... it's effectively unsupported right now anyway. (Is anyone stuck using the legacy .NET Framework with bepuphysics v2? Could you describe your situation? And do you think that situation will last into late next year with .NET 5?)
Not too much (unless I'm wrong about the platform dependent intrinsics being absent). MathF's inclusion does help clean up a few scalar paths a little.
-
I have toyed with the idea of a custom tool to do some very simple codegen without some of the nastiness of text templates, but then I have to slap myself because I already have an infinitely long to-do list :) Depending on the JIT, it may be possible to have zero overhead JIT-time specialization. Things like...
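One common way to get that kind of zero-overhead, JIT-time specialization is to branch on the IsSupported properties, which the JIT treats as constants so that untaken branches disappear from the generated code entirely (a generic sketch, not engine code):

```csharp
using System;
using System.Runtime.Intrinsics;
using System.Runtime.Intrinsics.X86;

public static class SpecializedMath
{
    public static Vector128<float> ReciprocalSqrtEstimate(Vector128<float> v)
    {
        if (Sse.IsSupported)
        {
            // Compiles down to a single rsqrtps on any x86 runtime with SSE.
            return Sse.ReciprocalSquareRootEstimate(v);
        }
        // Scalar fallback for runtimes without x86 intrinsics; when the branch
        // above is resolved at JIT time, this path is dropped completely.
        return Vector128.Create(
            1f / MathF.Sqrt(v.GetElement(0)),
            1f / MathF.Sqrt(v.GetElement(1)),
            1f / MathF.Sqrt(v.GetElement(2)),
            1f / MathF.Sqrt(v.GetElement(3)));
    }
}
```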
-
It is not my personal case, but for example, MonoGame is trying hard to keep compatibility with .NET 4.5 because, apparently, most console SDKs that support C# are stuck on .NET 4.5, and they move toward newer versions really slowly. MonoGame has a weird build system just because of that: the public repository has the code for most "public" platforms, but there's also a private repository, for which you need a console developer ID, that holds the code targeting the consoles (closed source under NDAs). So, whenever someone sends a pull request upgrading some framework to a newer version, it's immediately rejected because it might break the console code. Given that BepuPhysics is clearly aiming at videogames, it would be a waste if it blocked itself off from consoles, so I would suggest asking some developers working with consoles about the console framework life cycle, so you know what to expect and which frameworks are/will be supported.
-
Considering the lack of maintained C# physics engines, bepu looks really promising for adoption on a project I'm working on. However, I'm planning to target not only UWP but also WASM using Uno. I know the former doesn't support .NET Standard 2.1 yet. I'm also not sure how that would affect WASM support; has anyone tried bepu in WASM yet?
-
I haven't seen it- it would be an interesting experiment given the heavy reliance on vectorization. I assume it wouldn't vectorize yet (unless the transpilers and WASM SIMD proposals are further along than I thought), but the scalar fallback may still have acceptable performance if it can handle the compile-time branching properly.
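For reference, a minimal sketch (not engine code) of the kind of acceleration branch in question; when Vector.IsHardwareAccelerated reports false, as it presumably would on current WASM runtimes, only the scalar loop does real work:

```csharp
using System.Numerics;

public static class SpanScaling
{
    public static void Scale(float[] values, float factor)
    {
        int i = 0;
        if (Vector.IsHardwareAccelerated)
        {
            // Vectorized path: Vector<float>.Count lanes per iteration.
            for (; i <= values.Length - Vector<float>.Count; i += Vector<float>.Count)
            {
                (new Vector<float>(values, i) * factor).CopyTo(values, i);
            }
        }
        // Scalar remainder, and the entire loop when not accelerated.
        for (; i < values.Length; ++i)
        {
            values[i] *= factor;
        }
    }
}
```

Whether a WASM transpiler or interpreter eliminates the unaccelerated branch cleanly is exactly the open question.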
-
I'm using .NET Standard libraries, targeting UWP. A few people are saying .NET Core 3. There is almost zero reason to target .NET Core; it's not a good desktop platform, and by the time it is, it'll be called .NET 5. Microsoft is still working on a lot of the pieces that will unify UWP and desktop under .NET Core. UWP will get .NET Standard 2.1 at some point; not sure when, but I don't think it will be too far away. .NET Standard is the recommended way to write libraries. That will change the day .NET 5 implements .NET Standard, not a day before.
-
Hi, we are currently using it in a WPF application, so we are stuck on .NET Framework for a while; .NET Standard 2.0 is absolutely needed for us.
-
@impoetk Out of curiosity, have you investigated .NET Core 3's WPF support?
-
@reflectronic I don't think it's that simple.
I don't think I'd even try using something like ClangSharp - SharpDX couldn't even be kept supported because of the effort involved. So how am I going to generate my own version of SharpDX when an open-source project run by an expert couldn't maintain it? And if it were so simple, wouldn't Microsoft be providing C# DirectX? Thanks for the suggestion, but it's hard enough making a game on my own right now; I don't see how I could replace SharpDX in my spare time. No one else has managed to do it, have they? I've seen one or two efforts and they are quirky and require knowledge of C++ and how to use DirectX in that context.
-
You wouldn't be able to directly depend on a .NET 5 project from within a .NET Standard 2.0 project, but SharpDX or other .NET Standard 2.0 third party dependencies do not depend on bepuphysics2. Unless you require the older framework compatibility of .NET Standard for your project, there is not really any reason to use it.
The bepuphysics demos use SharpDX in .NET Core 2.1. I have private projects on Vulkan/DX12 running with .NET 5 previews. I'll admit I haven't paid any attention to UWP, since it's looking pretty dead-endy (at least with regard to a "UWP first/only" approach). As far as I know, .NET Core 3 brought with it desktop packs to make it easier to bring UWP/WPF/winforms/EF6/etc applications over. Using a windows-specific pack like WPF will make the application only run on windows, of course, but otherwise it works as expected. I'm not entirely sure if .NET Core 3 is deployable in the store or not- it may be. My understanding is that .NET 5, at least, should be deployable on the store directly, though I wouldn't recommend taking my word for it.
That's fine, you can keep using SharpDX if you'd like, it won't block anything. It really is easier than you might think though :)
-
The idea with .NET 5 is more or less that .NET Standard is going away. All of the platforms that .NET Core and Mono are supporting will come together in .NET 5, so your engine would probably target .NET 5 as well.
UWP is moving to .NET 5 (running under CoreCLR). Additionally, SharpDX should already work with UWP now, so it will continue to work with UWP in the future.
There are already bindings out there, e.g. https://github.com/terrafx/terrafx.interop.windows (which is also maintained by one of ClangSharp's maintainers). They map almost exactly to the underlying Windows APIs, and will continue to be updated to take advantage of new .NET features like function pointers.
-
"The idea with .NET 5 is more or less that .NET Standard is going away" Can you please tell me where that has been said? Or is that just a rumor that community members have been putting forward? My engine would have to continue to target .Net Standard most likely and any libraries I use would have to be .Net Standard - that provides best compatibility. SharpDX would need to be updated most likely to support .Net 5. I've talked to the developer of SharpDX, even he doesn't know if SharpDX will be usable with .Net 5 because we don't know what it looks like yet, what it's requirements will be. "UWP is moving to .NET 5 (running under CoreCLR). Additionally, SharpDX should already work with UWP now, so it will continue to work with UWP in the future." - SharpDX works with .Net Standard now as well - that's what's important - are people actually writing UWP libraries? And we don't know what UWP is going to look like under .Net 5 - Won't it just be a container with all C# development moving to .Net 5 ? That would mean the current UWP programming model will go away. We will program in .Net 5 C# and build for the UWP or Win32 containers. I think it's way too early to be having this discussion. .Net Standard is currently the only way to write libraries that are compatible across frameworks. Both .Net Framework and UWP are still at the table and will be into next year. This is a discussion for 2021/22 not 8 months before the framework even exists. .Net Standard isn't going anywhere any time soon. |
-
It's not going away in the sense that it's unsupported or no longer usable. There are just fewer use cases for new versions of .NET Standard after unification. As far as I've seen, the discussion around future .NET Standards is of the form 'not sure yet'. For example, from here: https://devblogs.microsoft.com/dotnet/introducing-net-5/
The idea is that all new applications can just target .NET 5 and it'll be fully supported on all the available recent runtimes. .NET Standard is only useful when trying to offer a common denominator across APIs of different capabilities, like .NET Framework and .NET Core. .NET Standard 2.0, for example, is already supported by recent versions of both of those. .NET 5, critically, is a superset of .NET Core. It implements .NET Standard 2.0. I can flip the switch in the bepuphysics demos to netcoreapp5.0 right now with SharpDX as a .NET Standard 2.0 dependency and everything works perfectly. This is not a thought experiment- I actually did it. SharpDX can be used from a .NET 5 application. Now, would SharpDX be able to directly target .NET 5 without some changes? Dunno; bepuphysics2 can't yet because of the stuff I did with the reference dummies, but that doesn't matter for the consumer of a .NET Standard 2.0 SharpDX. There's no need to update SharpDX to directly target .NET 5.
It's true that .NET Standard 2.0 is required to have a single target across old frameworks. The purpose of this issue is to figure out how many users of bepuphysics2 are targeting those old frameworks, and if they foresee still being in that situation after .NET 5 releases. That's valuable information for me even if I'm not currently in the process of pushing bepuphysics2 over. |
-
We are using Bepuphysics2 in Unity to simplify development (when the server physics processing is integrated into the build).
So the loss of .NET Standard 2.0 is dramatic for us.
-
Are you successfully running bepuphysics2 in Unity with decent performance? That would be pretty interesting- last I heard (months ago) there were still some orders of magnitude slowdowns.
-
This seems like a timely article from the .NET team: The Future of .NET Standard. Still, if you want folks using .NET Framework, Unity, and UWP to be able to use the library into 2021, you should stick to targeting .NET Standard 2.0. Many of the APIs available from 2.1 are also available as NuGet packages that still work on 2.0.
-
@Sergio0694 would any of your new High Performance APIs be helpful here? You already do a lot of the abstraction to multi-target both 2.0 and 2.1 where 2.1 can be taken advantage of, right? That could provide some of the performance gains while maintaining compatibility, without the bepu repo having to take on the extra overhead of managing the multi-targeting aspects itself.
-
@hawkerm Yes, the High Performance APIs multi-target for exactly that reason.
@RossNordby My general suggestion here would probably be to do what that package does and multi-target. Imagine you write a .NET Standard 2.0 library referencing a multi-targeted package: the consuming application's target framework determines which build of the package actually gets loaded, so consumers on newer runtimes still pick up the faster code paths.
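A minimal sketch of that multi-targeting pattern (a hypothetical helper, not taken from the toolkit; it assumes a netstandard2.0;netcoreapp3.0 multitarget, where NETCOREAPP3_0 is the SDK's default symbol for the newer target):

```csharp
public static class BitHelpers
{
    public static int Log2(uint value)
    {
#if NETCOREAPP3_0
        // The newer build uses the .NET Core 3.0 BitOperations API, which maps
        // to hardware intrinsics where available.
        return System.Numerics.BitOperations.Log2(value);
#else
        // The netstandard2.0 build carries a portable software fallback.
        int result = 0;
        while ((value >>= 1) != 0)
        {
            result++;
        }
        return result;
#endif
    }
}
```

Consumers on .NET Core 3.0 or later get the first branch; everyone else gets the second, all from a single NuGet package.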
-
Unfortunately, the divergence in optimized implementation can be sufficiently huge between .NET Core 2.0/Framework and .NET 5 that I realistically can't/won't keep both at locally maximum performance and feature parity in the long term. Beyond API differences, there are improvements in the JIT that enable large scale structural simplifications- but that simpler implementation would cause older runtimes to choke badly. Making large new changes that maximize performance while keeping old framework/runtime compatibility means doing about three times more work, which in practice means I'll make fewer changes. That's already happened- I've avoided making certain big changes because they benefit so much from .NET 5 and creating an equivalent legacy path would be a huge pain. What I'd really like to know is how many users are actually using bepuphysics2 on one of these older targets, and how many foresee still being in that situation as .NET 5 and .NET 6 roll out. In particular, I'd be very interested in hearing from anyone who's managed to actually get bepuphysics2 working well with Unity (@nick-dodonov ;)) since the last reports I heard suggested that it was effectively unusable due to either compiler problems or >100x slowdowns. In other words, I'd like to identify the non-hypothetical users who would be troubled by a future transition before putting in hundreds of hours of extra work :P
-
My take on this... In the long run this is not an issue
So, I think the real issue is what to do during the time it takes for everyone to get on the .NET Core train. Basically, we are left with those who use .NET Framework and rely, directly or indirectly via dependent libraries, on features/functionality which won't be ported to .NET 5. Without knowing the specific implementation details of bepuphysics2, I imagine the core issue, code-wise, is being able to utilize the framework and the actual runtime platform (device, hardware, OS, ...) in the best way possible for maximum performance. Based on this, I can think of solutions which would support .NET Framework, .NET Core (.NET 5, ...), and the device-specific implementations of .NET (supersets of .NET with added APIs and JITs optimized for the specific devices).
If optimizing for performance, a DSL (a domain-specific language for the performance-critical operations) is probably the best approach. It could be implemented without any overhead, and at different levels in the architecture: everything from adding two values together to more complex operations like transforming a matrix or checking whether anything has collided with anything else.
-
Not yet. The application is quite complex and I think it will take a long time before we move off .NET Framework. My guess is we'll wait for .NET 5 and slowly test what will run and what will not. We also depend a lot on third-party libraries that are specific (afaik) to the .NET Framework WPF implementation.
-
"Practically all new 3:ed party libraries and packages for the ".NET Universe" are already targeting, or soon will be, targeting .NET Core." - that's not true at all. Practically all 3rd party libraries are targeting .net standard. .net core hasn't even had support for desktop until very recently. And .net core is not .net 5. Desktop developers won't really have to consider .net core at all. I would say most .Net Framework and UWP developers will go straight to .Net 5 after it is feature complete - from 2021/22 onward. .net core has never been a target for libraries (except maybe for those people that jumped into .net core). |
-
Given continued progress in runtime unification and adoption, 2.4.0-beta5 now targets .NET 5.0. The intent is to have 2.3.0 be the last version targeting .NET Standard 2.0. While I cannot promise to change any major plans in response, I would still be interested in hearing from any users for whom this is a problem.
-
.NET Core 3.0 is now out, and it seems like a good time to collect information for the future of bepuphysics v2's dependencies.
At the moment, the library targets .NET Standard 2.0. It can run on .NET Framework, .NET Core, mono, and in theory any other runtime supporting the standard.
However, .NET Standard 2.0 is missing a few features which would be handy to have for performance and simplicity. 2.1 includes some of these features, but requires pretty recent platforms. .NET Framework 4.8, for example, will not support .NET Standard 2.1.
Further, even .NET Standard 2.1 is missing the platform specific intrinsics (at least as far as I can tell- let me know if I'm wrong about this!). The only full featured path appears to be .NET Core 3.0 (and later, the unified .NET 5).
So, the options: stay on .NET Standard 2.0, move up to .NET Standard 2.1, or target .NET Core 3.0 directly for full access to platform intrinsics. (The engine already uses Vector<T> extensively, so the benefit of platform dependent intrinsics isn't enormous. But it's not zero either.) Notably, runtime bugs in Unity currently block full functionality, and while workarounds can make parts of it work, it tends to be so slow that it's mostly pointless.
The obvious choice is to build for multiple targets, but this is a case where the cost of each additional target is enormous, and maintaining legacy paths would be painful. I'd like to see where people are using the engine to see if trimming down the support targets is possible.
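For illustration only (this is not engine code), here is roughly how the two API levels differ on something as simple as a horizontal sum; the intrinsic path needs .NET Core 3.0:

```csharp
using System.Numerics;
using System.Runtime.Intrinsics;
using System.Runtime.Intrinsics.X86;

public static class HorizontalSum
{
    // Portable Vector<T> path, available back to .NET Standard 2.0: lane-wise
    // operations only, so the reduction goes through per-element reads.
    public static float Sum(Vector<float> v)
    {
        float sum = 0;
        for (int i = 0; i < Vector<float>.Count; ++i)
        {
            sum += v[i];
        }
        return sum;
    }

    // Platform-dependent path: two haddps instructions when SSE3 is present.
    public static float Sum(Vector128<float> v)
    {
        if (Sse3.IsSupported)
        {
            var t = Sse3.HorizontalAdd(v, v);
            return Sse3.HorizontalAdd(t, t).ToScalar();
        }
        float sum = 0;
        for (int i = 0; i < Vector128<float>.Count; ++i)
        {
            sum += v.GetElement(i);
        }
        return sum;
    }
}
```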
Questions for users: which runtimes and platforms are you using bepuphysics2 on, and are any of them stuck on the older targets (.NET Framework, Unity, UWP)? If so, do you expect that situation to last into late next year, once .NET 5 is out?
(It's worth noting that, even if the decision was made to, say, only focus on .NET Core 3.0, it would be a while before I'd actually work on that, let alone finish it. This is primarily a medium-long term concern.)