C# – Why is decimal in C# different from other C# types?


I was told that decimal is implemented as a user-defined type, while other C# types like int have specific opcodes devoted to them. What's the reasoning behind this?

Best Solution

decimal isn't alone here; DateTime, TimeSpan, Guid, etc. are also custom types. I guess the main reason is that they don't map to CPU primitives. float (IEEE 754), int, etc. are pretty ubiquitous here, but decimal is bespoke to .NET.
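A quick way to see why it can't be a CPU primitive: decimal is a 128-bit value (sign, 96-bit integer, scale), so it doesn't fit in a register the way int or double do. A minimal sketch (the sizes shown are the documented sizes of the built-in types):

```csharp
using System;

class Sizes
{
    static void Main()
    {
        // int fits in a single register and has dedicated IL opcodes (add, mul, ...)
        Console.WriteLine(sizeof(int));      // 4

        // decimal is a 128-bit struct with no hardware or IL-level support;
        // its arithmetic is compiled as calls to methods on System.Decimal
        Console.WriteLine(sizeof(decimal));  // 16
    }
}
```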

This only really causes a problem if you want to talk to the operators directly via reflection (since they don't exist for int etc.), as shown below. I can't think of any other scenario where you'd notice the difference.
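For example, decimal's operators are ordinary static methods on the struct, so reflection can find them, whereas int addition is a raw IL opcode with no corresponding method on System.Int32. A small sketch:

```csharp
using System;
using System.Reflection;

class OperatorReflection
{
    static void Main()
    {
        // decimal defines op_Addition as a real static method on System.Decimal
        MethodInfo decimalAdd = typeof(decimal).GetMethod(
            "op_Addition", new[] { typeof(decimal), typeof(decimal) });
        Console.WriteLine(decimalAdd != null);   // True

        // int addition is the IL 'add' opcode, not a method, so there is
        // nothing to find on System.Int32
        MethodInfo intAdd = typeof(int).GetMethod(
            "op_Addition", new[] { typeof(int), typeof(int) });
        Console.WriteLine(intAdd == null);       // True
    }
}
```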

(actually, there are still structs to represent the others - they just lack most of what you might expect to find in them, such as operators)