Integer vs Int16  
kyus94





PostPosted: Common Language Runtime, Integer vs Int16

If the value of a variable is always going to be between 0 and 100, will it make any difference to the overall performance of the application if we define that variable as Integer instead of Int16? Assume that we have a large number of such variables and that they are used in a highly transactional app where their values keep changing.

Since nowadays all PCs come with 32-bit CPUs and 32-bit memory, does it make sense to use 16-bit datatypes? If it does, then how does it improve performance?
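
For concreteness, here is roughly the kind of declaration I mean (a minimal VB.NET sketch; the variable names are just placeholders):

    ' The value is always in the 0-100 range.
    Dim percentComplete As Integer = 0   ' Integer maps to Int32 (4 bytes)
    Dim percentCompact As Short = 0      ' Short maps to Int16 (2 bytes)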

Thanks




 
 
Michael Koster





PostPosted: Common Language Runtime, Integer vs Int16

Generally it does not make sense, performance-wise, to use 16-bit datatypes.
See this post: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=756832&SiteID=1

Michael



 
 
Greg Beech





PostPosted: Common Language Runtime, Integer vs Int16

Also, it depends on whether they are used in public interface methods. I don't know if there is a formal design guideline, but Int32 (Integer) is the de facto type for whole numbers in public interfaces; Int16 and Int64 are very rarely used, and even then only when there is a very good reason for doing so.
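
For example, a public method that accepts a percentage would typically still take an Integer (Int32), even though the value always fits in 16 bits. A rough VB.NET sketch (the class, method, and parameter names are made up for illustration):

    Public Class ProgressReporter
        ' Public surface uses Integer (Int32), the conventional whole-number type.
        Public Sub SetProgress(ByVal percent As Integer)
            If percent < 0 OrElse percent > 100 Then
                Throw New ArgumentOutOfRangeException("percent")
            End If
            ' ... store or use the validated value internally ...
        End Sub
    End Class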



 
 
nobugz





PostPosted: Common Language Runtime, Integer vs Int16

A 32-bit x86 CPU can natively handle 8-bit and 32-bit values, but it requires a special operand-size "prefix" opcode to handle 16-bit values, which costs an extra CPU cycle. You might get better performance if you have a large array of 16-bit values, because it is more likely that the array elements will fit in the L1 or L2 CPU cache.
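
To illustrate the array case: an array of Short takes half the memory of an Integer array with the same number of elements, so more of it fits in the L1/L2 cache. A rough VB.NET sketch (the element count and names are arbitrary):

    Module CacheSketch
        Sub Main()
            Const Count As Integer = 1000000

            ' 2 bytes per element: about 2 MB in total, more of which fits in cache.
            Dim shortValues(Count - 1) As Short

            ' 4 bytes per element: about 4 MB for the same number of elements.
            Dim intValues(Count - 1) As Integer

            Dim total As Long = 0
            For i As Integer = 0 To Count - 1
                ' Each Short is widened to 32 bits before the addition is performed.
                total += shortValues(i)
            Next
        End Sub
    End Module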