Currently, Duration uses FiniteF64 for its fields. It would be nice to move away from FiniteF64 for the internal representation and instead use i64, or a combination of u32 and u64 with an explicit sign.
Below are some pseudo representations (not including DateDuration or TimeDuration):
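For context on why a float-backed field is limiting: an f64 has only 53 bits of mantissa, so not every integer above 2^53 is representable, and large component values can silently lose precision. A minimal standalone illustration (not code from the crate):

```rust
fn main() {
    // 2^53 + 1 is the first integer an f64 cannot represent exactly;
    // it rounds down to 2^53 under round-to-nearest-even.
    let n: i64 = 9_007_199_254_740_993; // 2^53 + 1
    let as_float = n as f64;
    assert_eq!(as_float as i64, 9_007_199_254_740_992); // precision lost
}
```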
```rust
// Option 1: a signed i64 for every field.
struct Duration {
    years: i64,
    months: i64,
    weeks: i64,
    days: i64,
    hours: i64,
    minutes: i64,
    seconds: i64,
    milliseconds: i64,
    microseconds: i64,
    nanoseconds: i64,
}
```
```rust
// Option 2: a single explicit sign with unsigned magnitudes.
struct Duration {
    sign: Sign,
    years: u32,
    months: u32,
    weeks: u32,
    days: u64,
    hours: u64,
    minutes: u64,
    seconds: u64,
    milliseconds: u64,
    microseconds: u64,
    nanoseconds: u64,
}
```
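The second representation references a Sign type that is not defined above. A minimal sketch of what it could look like, mirroring the -1/0/1 sign values the Temporal specification uses for durations (the enum name and variants here are assumptions for illustration):

```rust
// Illustrative sketch of the Sign type referenced above; the name and
// variants are assumptions, not a confirmed design.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Sign {
    Positive,
    Zero,
    Negative,
}

impl Sign {
    // Convenience conversion to a multiplier, useful when balancing
    // components or converting back to signed arithmetic.
    fn as_i64(self) -> i64 {
        match self {
            Sign::Positive => 1,
            Sign::Zero => 0,
            Sign::Negative => -1,
        }
    }
}
```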
General things that need to be considered:
- How will this interface with engines, where inputs may be anywhere in the safe-integer range of ±(2^53 - 1)? (See the conversion sketch after this list.)
- Is there any point in the operations of the Temporal specification where precision on an unbalanced duration may matter?
- Should DateDuration and TimeDuration be preserved, or should the above serve as the external Duration with a new "internal Duration" introduced for calculation purposes?
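On the first question, one option is to validate engine-supplied f64 inputs against the safe-integer bound at the conversion boundary, before storing them in the integer representation. A hedged sketch (the helper name and error type are hypothetical):

```rust
// Largest integer a JS engine can pass losslessly as an f64: 2^53 - 1.
const MAX_SAFE_INTEGER: f64 = 9_007_199_254_740_991.0;

#[derive(Debug)]
struct RangeError;

// Hypothetical helper: convert an engine-supplied f64 field into the
// i64 representation, rejecting non-finite, non-integral, or
// out-of-range values.
fn to_duration_field(value: f64) -> Result<i64, RangeError> {
    if !value.is_finite() || value.trunc() != value {
        return Err(RangeError);
    }
    if value.abs() > MAX_SAFE_INTEGER {
        return Err(RangeError);
    }
    Ok(value as i64)
}

fn main() {
    assert_eq!(to_duration_field(5.0).unwrap(), 5);
    assert!(to_duration_field(f64::NAN).is_err());
    assert!(to_duration_field(9_007_199_254_740_992.0).is_err()); // 2^53
}
```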