78

Many high-level programming languages have built-in features to format a number with a system-dependent currency symbol:

' Outputs $100.00 (en-US) or € 100,00 (de-AT)
Console.WriteLine(100.ToString("C")); // C#
FormatCurrency(100)                   ' VBA

I've been developing business software for more than 20 years, and I've not yet found a single use case for this feature.

The thing is: $ 100 is a completely different amount of money than € 100. If I store 100 in a database field and just "format the value as currency", the user will get a different amount of money depending on their system setting.
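To make the pitfall concrete, here is a minimal sketch in Java (used only because it exposes the equivalent locale-driven API; the C#/.NET behavior is analogous): the same stored number renders as two different amounts of money depending solely on the locale.

```java
import java.text.NumberFormat;
import java.util.Locale;

public class LocaleCurrencyPitfall {
    public static void main(String[] args) {
        double stored = 100.0; // the single value persisted in the database

        // The currency symbol is chosen by the locale, not by the data:
        String us = NumberFormat.getCurrencyInstance(Locale.US).format(stored);
        String at = NumberFormat.getCurrencyInstance(Locale.forLanguageTag("de-AT")).format(stored);

        System.out.println(us); // a dollar amount on a US-configured machine
        System.out.println(at); // a euro amount on an Austrian one
    }
}
```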

Even if I always override the locale, that does not necessarily mean that the currency symbol will stay constant: de-AT (together with a lot of other locales) switched from ATS to EUR about 20 years ago. In that case, the amount of money displayed would vary not only by locale but also by operating system patch level.

What am I missing? For which use case is this feature actually useful?

Heinzi
  • 9,868

7 Answers

35

Is there a use-case for built-in currency formatting?

Basically, with currencies you have two ways of working:

  • in a currency-aware environment, where people record amounts sometimes in the local and sometimes in a foreign currency: you will never use the default built-in feature. Instead you'll store a currency amount together with a currency code.
  • in a currency-neutral environment. Believe it or not, most private individuals and most small businesses around the world work with only one local currency, which happens to be the currency configured in their OS settings and never changes. Using the built-in formatting takes advantage of this fact and uses the OS configuration instead of forcing you to add your own configuration step to your software. As a bonus, this formatting generally also uses the right decimal and thousands separators.

So yes, there is a use-case for this feature.
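A hypothetical sketch of the currency-aware option (Java for illustration; the `Money` type and its rendering are assumptions, not a standard API): the currency code travels with the amount, so display no longer depends on the OS settings of whoever views it.

```java
import java.math.BigDecimal;
import java.util.Currency;

// Hypothetical currency-aware value type: amount plus ISO 4217 code.
record Money(BigDecimal amount, Currency currency) {
    @Override
    public String toString() {
        return currency.getCurrencyCode() + " " + amount;
    }
}

public class CurrencyAwareDemo {
    public static void main(String[] args) {
        Money invoice = new Money(new BigDecimal("100.00"), Currency.getInstance("EUR"));
        System.out.println(invoice); // EUR 100.00 on every machine
    }
}
```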

But with limitations

This being said, I'm not sure that this standard feature works well, out of the box, and in a portable way with:

  • currencies that are expected to be displayed with a different number of decimals than the usual two (such as JPY, which is usually shown with no decimals at all, or KWD, which takes 3 decimals) (if you know, please comment)
  • the local financial convention of showing negative amounts either with a minus sign or as a positive number in brackets
  • other practical conventions, such as showing the currency symbol to the left (US, UK) or to the right (FR, DE) of the amount.

Although OSes may handle these rather well (e.g. Windows, macOS), the OS-independent programming language implementations are sometimes full of surprises and missing flexibility, which could limit the usefulness of this feature for other reasons.
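On the first limitation, at least the per-currency decimal count is available in some runtimes. For instance, Java's `java.util.Currency` ships the ISO 4217 minor-unit data (shown here as an illustration, not as a claim about every language's built-in formatter):

```java
import java.util.Currency;

public class MinorUnits {
    public static void main(String[] args) {
        // ISO 4217 default fraction digits, as shipped with the JDK:
        System.out.println(Currency.getInstance("USD").getDefaultFractionDigits()); // 2
        System.out.println(Currency.getInstance("JPY").getDefaultFractionDigits()); // 0
        System.out.println(Currency.getInstance("KWD").getDefaultFractionDigits()); // 3
    }
}
```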

Glorfindel
  • 3,167
Christophe
  • 81,699
13

You are absolutely right: formatting using a system-dependent currency symbol is dangerous. I have actually known people who lost lots of money through that, especially with the US dollar and the Euro being close enough that the numbers still make sense.

On iOS you typically use a currency code, and the currency code is displayed in a system dependent way. For example, if the currency code is “Hong Kong Dollar”, that will be displayed as “Dollar” in Hong Kong and as “Hong Kong Dollar” everywhere else. Or take the currency symbol “Euro”, which may be displayed as € or Euro, before or after the number, depending on your system.

But just marking something as “Currency” is stupid and dangerous. I think Excel does that.

gnasher729
  • 49,096
11

The question seems to be "why do programming languages, frameworks and operating systems support features that are not best practice for professional developers in large multinational corporations?"... and the answer is that there are developers who are not in that category.

If your language only supports enterprise-level features, it will not achieve wide adoption. This is a real concern for languages targeting "everyone", like C#, Basic, Java, Python, ...

Imagine a typical second-day assignment: "Alice had 10 (units of your local currency) and paid 4 for candy; print how much she has left". Normally this requires knowledge of one basic numeric type (like int) and a way to print it, possibly with default formatting. Now, if we required proper currency handling:

  • one must understand the difference between general-purpose numeric types and monetary ones (like float and decimal in C#)
  • one must understand complex types that combine an amount, a currency, and potentially other properties such as rounding rules to represent "10 units of currency" (and probably the difference between mutable and immutable types as well)
  • one must have some understanding of globalization to know how to pick the "local" currency format
  • indeed, the language could not support inline string literals, as those are bad for localization, so some understanding of localization in the given framework/language would be needed too
  • and since the goal is to be really serious about proper localization/globalization support, a good understanding of RTL/LTR languages and mixed strings is needed; clearly the solution must support all sorts of combinations, and a Hebrew string with Czech koruna as the currency is a good start.

This would make writing reasonably correct code that handles money for an individual person on their local machine too hard, and would make the language unsuitable for beginners and hobbyists.
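For contrast, the second-day version of the assignment that the simple defaults make possible (Java here as an illustration):

```java
public class Candy {
    public static void main(String[] args) {
        int alice = 10; // units of the local currency
        alice -= 4;     // paid for candy
        System.out.println(alice); // 6
    }
}
```

One numeric type, one print statement, done. That simplicity is exactly what the answer argues the built-in defaults exist to preserve.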

Someone writing their first (and possibly last) piece of code to calculate savings growth from a percentage rate is unlikely to ever need it in a multi-currency environment.

There are a lot of features that simply should not exist in a language/framework by the same reasoning:

  • string literals couldn't be used for anything except maybe field names, and should not be printable by themselves; only localizable strings would be allowed
  • date/time could not exist by itself without explicitly specifying timezone information and the rules to apply when that timezone information changes
  • no naked numeric types (int, float): every numeric value would need units of measure and, directly or indirectly, properties such as rounding rules, overflow rules, precision rules, and compatibility with other types for all operations
  • no "just random number generators": every single one would need to specify its distribution and randomness source at the very least
  • all math operations would have to be fully hardened: integer division should only allow the exact case (2/3 should fail), rounding must always have an explicit direction, and operations losing precision must define the outcome explicitly inline in the code; even "+ 1" must specify it, to avoid the increment getting lost with large float values.
Alexei Levenkov
  • 392
  • 1
  • 10
8

There are a lot of long answers to a simple question here. You ask for a use-case and there's a simple one that I don't think has been mentioned yet: games.

If a game involving money is set in an ambiguous location, why not use the player's local currency? It creates a more immersive experience and, as the feature is built into the language, costs very little to add to the game.

Kichi
  • 89
7

Thirty years ago or more, it was probably still reasonable to assume that most computer systems that dealt with financial amounts did so exclusively in the local currency.

In the English-speaking world, and in the advanced economies more generally that computers were built to serve, local currencies had never changed during the computer era, and many could be traced back centuries, so the idea of a local currency changing seemed a fanciful future possibility.

Both Python and VBA can trace their language design back that far. The design of Excel (as @gnasher729 mentions in his answer) goes back even further.

The .NET platform, which came together in the late 90s, likely inherited that design perspective without much further consideration. It does, however, also accept a specific culture as a parameter to the string formatting, which need not be the local culture, so there was some consideration of handling multiple currencies.
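That culture-as-a-parameter escape hatch looks like this in Java terms (Java for illustration; .NET's `ToString("C", ...)` overload taking a culture works the same way):

```java
import java.text.NumberFormat;
import java.util.Locale;

public class PinnedLocale {
    public static void main(String[] args) {
        // The locale is pinned in code instead of taken from the host
        // machine, so the output no longer varies with OS settings:
        NumberFormat austrian = NumberFormat.getCurrencyInstance(Locale.forLanguageTag("de-AT"));
        System.out.println(austrian.format(100));
    }
}
```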

Nowadays, currencies have become as vexed as datetimes, and the only sane option is to store the currency denomination with the amount, in the same way as storing the local timezone or location with the datetime. And god help the developer who has to deal with multiple currencies within the same system, because string formatting will be the least of his concerns.

So it seems to me that the answer to the question, is that the currency formatting functionality of those languages, which bases the format on the system settings, is simply an obsolete feature, and a legacy of decades-old design.

Steve
  • 12,325
  • 2
  • 19
  • 35
5

TL;DR

Currency formatting has been an OS-level configuration for decades now, and the pre-internet days were a very different beast in terms of the frequency of international transactions and the need for someone in region A to express money using region B's currency.

I suspect the OS-centric currency settings are a relic of the past, kept in either because it simply hasn't been re-evaluated yet, or specifically to provide some backwards compatibility for older tools.


This is not about formatting

While some existing answers provide information and food for thought, I'm also noticing a lack of distinction being made between the choice of currency and the choice of currency formatting.

While the formatting of currency makes sense as a local machine decision, the currency symbol itself (not its position, which is also formatting) doesn't quite make sense to be decided by the machine instead of by the data source that provided the monetary amount to be formatted.

It makes no sense for someone to tell me "It costs 100" and for me to then go "oh I prefer that those be 100 yen then!".

I agree on all of the formatting arguments making sense as local machine decisions, but not on the choice of the currency symbol itself, specifically.


What's the benefit of having the OS decide the symbol?

However, currency formatting (and all other numerical formatting) has been an OS-level configuration for decades now, and the pre-internet days were a very different beast in terms of the frequency of international transactions and the need for someone in region A to express money using region B's currency.

There's only one scenario where having this be a local machine decision makes sense, and it requires both of the following:

  • You're developing software that you intend to sell in regions with a different currency
  • Your customer's ecosystem itself only ever works in its own chosen currency, without ever changing. This could be a single machine, a single customer, or a company that operates within one specific currency region.

This is the only case I can think of where this setting is not a problem and actually adds something of value.

In such a case, developers of the software don't need to explicitly account for any future customer's possible currency, in case their software sells well internationally. They wouldn't need to adjust and re-release their code just because their tool is now also being sold in another country.

Instead, they can just represent money numerically and label it as "whatever currency you (the client) use", and then blindly trust that the customer's machine presents this currency the way they like to see it.

When you are a customer whose entire ecosystem (and therefore all input/output of that software) is in a single currency, then monetary values really can be represented as "just a number" to you, since you never need to distinguish between different currencies.

As a basic example, if you need to divide 100 moneys between 5 people, then each person gets 20 moneys. This is correct regardless of what currency you work with, as long as all the values I just used are expressed in the same currency.
However, if one of those people needs to be paid in USD and the others in EUR, then your elegant mathematical and currency-free calculation goes out the window.
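The split above, sketched with exact decimal arithmetic (Java for illustration):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class SingleCurrencySplit {
    public static void main(String[] args) {
        // In a single-currency ecosystem, money really is "just a number":
        BigDecimal total = new BigDecimal("100");
        BigDecimal share = total.divide(new BigDecimal("5"), 2, RoundingMode.HALF_UP);
        System.out.println(share); // 20.00

        // As soon as one share must be paid in another currency, a plain
        // number stops being enough: the amounts need currency labels and
        // an exchange rate before the arithmetic means anything.
    }
}
```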

With the advent of the internet and international transactions, the whole "single currency ecosystem" idea went straight out the door.

I suspect the OS-centric currency settings are a relic of the past, kept in either because it simply hasn't been re-evaluated yet, or specifically to provide some backwards compatibility for older tools.

Flater
  • 58,824
0

Handling multiple currencies has been a problem for many centuries (even millennia). Only early computing solutions (initially built on US English assumptions) oversimplified the problem.

Multiple currencies will not go away; new currencies are still being created (notably virtual currencies, or "community/social" currencies created locally to counter the influence of global companies on the local economy).

No unification has been achieved, as it is not realistic for many markets. And let's remember that business is now done in a worldwide context: it makes little sense today to do business (or even have social interactions) with a single currency.

You may think that the ISO codes are enough, but this is false: many currencies used today are not encoded, simply because they are not traded openly on world markets (some local currencies forbid such trades to favor the development of local businesses and activities, and more social and balanced terms of exchange in terms of the actual work or service delivered and the time and effort spent solving real problems).

ISO codes are good for currently traded currencies in a multilingual worldwide context; they are still poor at modeling social interactions. Consider each currency to have its own identity and its own value (possibly independent and not convertible to other currencies, except by specific ad hoc private agreements between their users).

What matters then is identifying the users or communities willing to trade them. Since we are identifying organizations, there can be millions of them. A single 3-character code cannot represent all the organizations of the world, and each organization adopts its own terminology, conventions and languages.

In summary, currencies are not bound to any system (a system has no social interactions), and a system-level currency does not make any sense, not even for the "open" world market where they "may" be traded (which is not always the case).

So you need a clear concept where amounts identify both the sum/value and the identity of the currency. Formatting is a secondary aspect, valid only within the scope of the communities accepting to trade the currency openly rather than just privately (private transactions occur much more often than transactions on openly traded markets: this is called the "black market", and governments don't like it, but it has always existed). The only advantage of "open" currencies is that governments and banks accept to trade them in a limited market, and that they offer a limited warranty of value. We know these warranties are very limited, and in fact decreasing: "open" currencies can be seized legally by governments or banks in their own interest without asking the communities. Yet this is the basis of the fiscal systems all governments depend on, and governments should remember that they can do this only because of the limited trust granted by their populations, who have transferred only part of their freedom to make their own trading decisions without permission.

Not all currencies are convertible, and not all of them are translatable either. The formatting options also vary a lot (not all use a decimal system, not all use the same rounding conventions). Each currency identifies its own market, a set of people with their own cultures and conventions, a set of private transactions between them, and its own rules governing its issuance. And let's not forget that some "world currencies" are not even tradable openly; they are reserved for certain institutions, like the SDR (special drawing rights) of the International Monetary Fund, and many assets reserved to traders on very risky markets: there are tons of funds, shares, bonds and warrants constantly created, most of them unregulated, and most escaping fiscal systems as they are not even located in a clear jurisdiction; many of them are also abandoned every day and fall to a null value for any further trading, exactly as organizations are created and die.

All currencies have a limited lifetime, just like organizations, countries, and people in real life, and just like the trust they give to each other, or revoke at an incredible rate. It is nonsense, then, to speak about a "system currency" unit: it was introduced in the 1960s but is already abandoned today, or at least should no longer be used, as it was a bad design and a bad representation of the real world.