C# – Requesting memory for your application

c# memory-management sql-server-2005

I am having a similar issue to this person. The primary difference is that the application is NOT meant for a developer environment, so I need to know how to optimize the memory used by SQL Server (possibly per machine, based on specs).

I was intrigued by Ricardo C's answer, particularly the following:

Extracted from the SQL Server documentation:

Maximum server memory (in MB)

Specifies the maximum amount of memory SQL Server can allocate when it starts and while it runs. This configuration option can be set to a specific value if you know there are multiple applications running at the same time as SQL Server and you want to guarantee that these applications have sufficient memory to run. If these other applications, such as Web or e-mail servers, request memory only as needed, then do not set the option, because SQL Server will release memory to them as needed. However, applications often use whatever memory is available when they start and do not request more if needed. If an application that behaves in this manner runs on the same computer at the same time as SQL Server, set the option to a value that guarantees that the memory required by the application is not allocated by SQL Server.
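
If you do decide to cap SQL Server's memory, sp_configure is the documented mechanism for setting the option. Here is a minimal C# sketch of doing it programmatically; the connection string and the 1024 MB figure are placeholder assumptions you would adjust for your machine:

    using System.Data.SqlClient;

    class ConfigureSqlServerMemory
    {
        static void Main()
        {
            // Hypothetical connection string -- point it at your own instance.
            const string connectionString =
                "Data Source=localhost;Integrated Security=True";

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // 'max server memory' is an advanced option, so it has to
                // be made visible before it can be changed.
                Run(connection, "EXEC sp_configure 'show advanced options', 1; RECONFIGURE;");

                // Cap SQL Server at 1024 MB (an arbitrary example value),
                // leaving the remaining physical memory for the other
                // applications on the machine.
                Run(connection, "EXEC sp_configure 'max server memory', 1024; RECONFIGURE;");
            }
        }

        static void Run(SqlConnection connection, string sql)
        {
            using (var command = new SqlCommand(sql, connection))
            {
                command.ExecuteNonQuery();
            }
        }
    }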

My question is: how does an application request memory from the OS when it needs it? Is this something built in at compile time, or something the developer manages? The two primary apps running on this machine are SQL Server and the (fairly heavyweight) C# application I'm developing, and I'm almost certain we didn't do anything specific about asking the OS for memory. Is there a correct/necessary way to do this?

Best Answer

Some applications allocate a lot of memory at startup, and then run their own memory management system on it. This can be good for applications that have particular allocation patterns, and that feel they can do a better job than the more generic memory manager provided by the runtime system.

Many games do this, since they often have a very good idea of what their memory usage pattern is going to look like and are often heavily optimized; the default/system allocator is general-purpose and not always fast enough. Doom did this and is fairly well known for it, and of course its source code is available and widely discussed.
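
As a rough illustration of the pattern, here is a minimal bump ("arena") allocator, written in C# for consistency with the question even though a game engine would do this in C or C++. The Arena name and its API are made up for illustration; real allocators also handle freeing, alignment, and fragmentation:

    using System;

    class Arena
    {
        private readonly byte[] _block;
        private int _offset;

        public Arena(int sizeInBytes)
        {
            // One big request to the runtime/OS at startup...
            _block = new byte[sizeInBytes];
        }

        // ...after which every "allocation" is just bumping an offset
        // inside the block, with no further trips to the OS.
        public ArraySegment<byte> Allocate(int sizeInBytes)
        {
            if (_offset + sizeInBytes > _block.Length)
                throw new OutOfMemoryException("arena exhausted");

            var slice = new ArraySegment<byte>(_block, _offset, sizeInBytes);
            _offset += sizeInBytes;
            return slice;
        }

        // Freeing everything at once is O(1): just reset the offset.
        public void Reset()
        {
            _offset = 0;
        }
    }

The point is that the one large allocation happens once, and every later allocation is a trivially cheap offset bump, which is exactly the kind of predictable cost a generic allocator can't promise.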

In "managed" languages like C#, I think this is very rare and nothing you need to worry about.
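
That answers the practical side of the question: in C# you "request" memory simply by allocating objects with new, and the CLR's garbage collector grows the managed heap, asking the OS for more on your behalf. A small demonstration (the 64 MB figure is arbitrary):

    using System;

    class ImplicitAllocation
    {
        static void Main()
        {
            long before = GC.GetTotalMemory(true);

            // No explicit OS call here: the CLR notices the managed heap
            // is too small and requests more memory from the OS itself.
            byte[] buffer = new byte[64 * 1024 * 1024];

            long after = GC.GetTotalMemory(false);
            Console.WriteLine("Managed heap grew by roughly {0} MB",
                              (after - before) / (1024 * 1024));

            // Keep the buffer alive so it isn't collected before we measure.
            GC.KeepAlive(buffer);
        }
    }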