What is the largest number a data type can accurately store?


Okay... I know that Decimal is accurate up to 29 digits, and that Single and Double can store even larger numbers. What I want to know is: how much accuracy is lost using Single and Double? I think that, because they use floating point, Single and Double become quite inaccurate *scratch* I'm not too sure...

If anyone could help, it would be a huge help.

Thanks!
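For what it's worth, the loss is predictable: IEEE 754 Single has a 24-bit significand (integers exact up to 2^24, about 7 decimal digits) and Double has a 53-bit significand (exact up to 2^53, about 15-16 digits). Here's a small sketch in Python, whose built-in `float` is a 64-bit double like VB.NET's Double; the `to_single` helper is something I made up to round-trip a value through a 32-bit float and mimic Single:

```python
import struct

def to_single(x):
    # Round-trip through a 32-bit IEEE 754 float to mimic VB.NET's Single.
    return struct.unpack('f', struct.pack('f', float(x)))[0]

# Double: 53-bit significand, so 2**53 + 1 is not representable
# and rounds back down to 2**53.
print(2.0**53 + 1 == 2.0**53)            # True: the +1 is lost

# Single: 24-bit significand, so precision runs out at 2**24.
print(to_single(2**24 + 1) == 2**24)     # True: 16777217 rounds to 16777216
print(to_single(2**24 + 2) == 2**24 + 2) # True: 16777218 is still exact
```

So Single and Double never store a "wrong" digit at random; they just can't distinguish values closer together than the spacing of representable numbers at that magnitude, which is why Decimal (exact to 29 digits) is preferred for money.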

