
Scuka1 t1_ja1v0b8 wrote

Ah.

Meh. Only applicable if you have a weekly salary.

All the bills and subscriptions are monthly though, so I don't see much point in a weekly salary either, except 4.345× the paperwork.

For life expenses (fuel, food), the 4.345 thing doesn't make sense either. You get a more accurate result if you actually average out the entire month, instead of averaging out a week and multiplying by 4.345. If you average out, say, 3 months and then divide by 3, you get an even more accurate monthly average.

The longer the time period, the more accurately your average represents the actual situation because outliers get ironed out.

A week is too short to capture general trends.
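
Quick sketch of what I mean, with made-up numbers (Python, hypothetical spending figures):

```python
# Hypothetical weekly food/fuel spending over ~3 months (13 weeks).
# Week 3 has a big one-off expense in it.
weekly_spend = [80, 75, 260, 85, 90, 78, 82, 88, 79, 84, 91, 77, 86]

WEEKS_PER_MONTH = 4.345

# Extrapolating from a single week swings wildly depending on which week you pick:
from_outlier_week = weekly_spend[2] * WEEKS_PER_MONTH   # ~1130, way too high
from_typical_week = weekly_spend[0] * WEEKS_PER_MONTH   # ~348, misses the outlier

# Averaging the whole 3 months and dividing by 3 irons the outlier out:
from_three_months = sum(weekly_spend) / 3               # ~418
```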

3

Burstar1 t1_ja2a89r wrote

>You get a more accurate result if you actually average out the entire month

Not necessarily. Months vary in length, so if accuracy is your argument, what you're really saying is convert everything to annual and compare that way. Which... is an option, ofc.

In practice, many don't have a choice BUT to be paid weekly or bi-weekly and are living paycheck to weekly paycheck (and consequently need budgeting the most). Additionally, a lot of expenses are budgeted weekly despite longer terms being better for the math. The individual may know they can eat out once a weekend and also want to know how much they can afford to spend on entertainment that week. Car loan amortized weekly?
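
E.g., a rough sketch of that weekly split, with made-up numbers (Python, hypothetical bills and paycheck):

```python
# Hypothetical monthly commitments for someone paid weekly.
monthly_bills = {"rent": 900, "utilities": 120, "subscriptions": 45, "car_loan": 260}

WEEKS_PER_MONTH = 4.345
weekly_paycheck = 600

# Amount to set aside each week so the monthly bills get covered:
weekly_bill_share = sum(monthly_bills.values()) / WEEKS_PER_MONTH   # ~305

# What's left per week for food, fuel, eating out, entertainment:
weekly_discretionary = weekly_paycheck - weekly_bill_share          # ~295
```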

tl;dr: this is how you can do it if you need to but, shocker, there's more than one way to do math if you don't.

1

Scuka1 t1_ja2bgas wrote

>there's more than one way to do math

Sure, but some ways give you more accurate results than others.

If you take a week's worth of data and multiply to get a month, you're also multiplying any mistakes or outliers there might be. If you take 3 months' worth of data and divide to get a picture of your average month, you're dividing, i.e. ironing out, any outliers. That gives you better power to predict, say, how much money you're going to need in the next 10 months.

Say you eat at home almost every day of the month, but once a month you eat at a fancy restaurant. If you take data from the week you ate at that fancy restaurant and multiply by 4.345, it'll look like you eat at a restaurant 4 times per month. If you take data from a week without the restaurant visit, the fancy meal won't be captured at all. Either way, you're getting inaccurate monthly data.
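
To put rough (made-up) numbers on that:

```python
WEEKS_PER_MONTH = 4.345

home_week = 7 * 12           # 84: a week eating at home, ~12/day
fancy_week = 6 * 12 + 120    # 192: same week but with one 120 restaurant meal

# Roughly what the month actually costs (one restaurant week, the rest at home):
actual_month = (WEEKS_PER_MONTH - 1) * home_week + fancy_week   # ~473

# Extrapolating from either single week misses it:
from_fancy_week = fancy_week * WEEKS_PER_MONTH    # ~834, as if you ate out ~4x
from_home_week = home_week * WEEKS_PER_MONTH      # ~365, the meal vanishes entirely
```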

Longer time frame = more accurate result in terms of predictive power (e.g. predicting how much money you're going to need over, say, the next 10 months).

−1