Consider the following input in a financial application, where precision matters:
{ "value": 3.8 }
And the following AWS Lambda function:
```python
from decimal import Decimal

def lambda_handler(event, context):
    value = event['value']
    print(Decimal(value))
```
The output is `3.79999999999999982236431605997495353221893310546875`, because the Lambda runtime parsed the JSON number into a Python float, which can't precisely represent 3.8.
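To see that the loss happens at float construction rather than inside `Decimal`, compare building a `Decimal` from the float with building it from the string:

```python
from decimal import Decimal

# From a float: Decimal captures the float's exact binary value,
# which is not exactly 3.8
print(Decimal(3.8))    # 3.79999999999999982236431605997495353221893310546875

# From a string: the value is stored exactly as written
print(Decimal("3.8"))  # 3.8
```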
I know that I can re-serialize `event` back to a JSON string and then re-parse it, instructing the parser to use `Decimal` (this is from the DynamoDB Python docs):
```python
import json
from decimal import Decimal

def lambda_handler(event, context):
    parsed = json.loads(json.dumps(event), parse_float=Decimal)
    print(parsed['value'])
```
But that feels like a hack. Is there some way to control the deserialization in the first place, so that `event` prefers `Decimal` over `float`?
Note that `Decimal(3.8) * Decimal(10)` returns `37.99999999999999822364316060`, because `Decimal(3.8)` starts from the already-inexact float. No bueno in a financial application. One alternative is to keep amounts in integer minor units (cents), where `380 * 10 = 3800` is exact. Short of that, you can wrap `json.loads(json.dumps(event), parse_float=Decimal)` in a function so the code looks a bit better. You could also change the deserialization itself, but you would have to look at where the `event` object is created. Maybe SymPy can help you too. Good luck.
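A minimal sketch of the wrapper idea, assuming the round-trip approach from the question (the helper name `to_decimal` is my own invention):

```python
import json
from decimal import Decimal

def to_decimal(event):
    """Hypothetical helper: round-trip the event through JSON so that
    every float is re-parsed as a Decimal."""
    return json.loads(json.dumps(event), parse_float=Decimal)

def lambda_handler(event, context):
    parsed = to_decimal(event)
    # With {"value": 3.8}, this is Decimal('3.8') * Decimal(10) == Decimal('38.0')
    print(parsed['value'] * Decimal(10))
```

This works because Python's `json.dumps` emits the shortest string that round-trips the float (`3.8`), and `parse_float=Decimal` then hands that exact string to the `Decimal` constructor.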