function getMonthName(monthNumber) {
  const date = new Date();
  date.setMonth(monthNumber - 1);
  return date.toLocaleString([], { month: 'long' });
}
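The catch is that the result depends on whatever today's date happens to be. A minimal sketch of the failure, with the "current" date pinned to the 31st of a month so it reproduces no matter when you run it (the concrete date is purely illustrative):

const date = new Date(2023, 4, 31);                        // 31 May 2023, standing in for "today"
date.setMonth(2 - 1);                                       // asks for "31 February", which rolls over into March
console.log(date.toLocaleString([], { month: 'long' }));    // logs "March" (not "February") in an English locale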
The number of people in this thread arguing that this is fine behavior makes the whole thing even funnier.
What would you expect “-1 month” to do for a date like the 31st of March? Would the result be the same as for “-1 month” on the 29th of March?
If you go back 2 months, so that the 31st exists again - should the result of using -1 month twice be different from using -2 months? (See the sketch below.)
I think it’s just a stupid way to implement something like this: a “month” isn’t a fixed size, so picking a fixed value and documenting it properly is a decent workaround, but no one should use that kind of function in the first place.
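A quick JavaScript sketch of the scenario raised above, with illustrative dates: because the built-in Date overflows instead of clamping, subtracting one month twice does not land on the same day as subtracting two months at once.

const a = new Date(2023, 2, 31);   // 31 March 2023
a.setMonth(a.getMonth() - 1);      // "31 February" overflows to 3 March
a.setMonth(a.getMonth() - 1);      // minus one month again: 3 February
const b = new Date(2023, 2, 31);   // 31 March 2023 again
b.setMonth(b.getMonth() - 2);      // minus two months in one step: 31 January
console.log(a.toDateString());     // a day in early February
console.log(b.toDateString());     // 31 January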
This is literally how every sane API works in languages built by adults. For example, here’s what happens in Java:
java.time.LocalDate.of(2023, 3, 31)
> #object[java.time.LocalDate 0x2bc77260 "2023-03-31"]
java.time.LocalDate.of(2023, 3, 31).minusMonths(1)
> #object[java.time.LocalDate 0xac0dc15 "2023-02-28"]
java.time.LocalDate.of(2023, 3, 31).minusMonths(2)
> #object[java.time.LocalDate 0x44b9305f "2023-01-31"]
I have no idea where people get this notion that a month isn’t a defined size. Do people just not understand the concept of a month?
I would expect the month to increment by one and the day to be clamped to the valid days for the month.
That’s precisely what I’d expect as well, and what APIs in languages like Java do.
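For completeness, a rough JavaScript sketch of that clamping rule (the helper name is made up, and it deliberately ignores the time of day): cap the day at the last valid day of the target month, which matches what java.time.LocalDate's plusMonths/minusMonths do in the example above.

function addMonthsClamped(date, months) {
  const day = date.getDate();
  // The Date constructor normalizes out-of-range month indexes (e.g. -1 becomes December of the previous year).
  const result = new Date(date.getFullYear(), date.getMonth() + months, 1);
  // Day 0 of the following month is the last day of the target month.
  const lastDay = new Date(result.getFullYear(), result.getMonth() + 1, 0).getDate();
  result.setDate(Math.min(day, lastDay));
  return result;
}

addMonthsClamped(new Date(2023, 2, 31), -1); // 28 February 2023
addMonthsClamped(new Date(2023, 2, 31), -2); // 31 January 2023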