It is, in fact, completely arbitrary. There is no reason why we should read 1+2*3 as 1 + (2*3) instead of (1 + 2) * 3 except that it is conventional and having a convention facilitates communication. No, it has nothing to do with set theory or mathematical foundations. It is literally just a notational convention, and not the only one that is still currently used.
Yeah, I have no idea what I was saying when I said that; I've edited my comment a bit.
On that note, though, using your example I think I can illustrate the point I was trying to make earlier.
In 1 + (2*3), by always doing multiplication first we can drop the brackets.
(1 + 2) * 3 can be rewritten as (1 * 3) + (2 * 3), so applying the multiplication-first rule again makes sense. That is a crappy explanation but I think you get my gist.
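For what it's worth, here's a quick sanity check (Python, just my own illustration) that the two readings genuinely disagree and that the distributive rewrite holds:

```python
# "Multiplication first" reading vs. "left to right" reading of 1+2*3:
assert 1 + (2 * 3) == 7
assert (1 + 2) * 3 == 9

# Distributivity: (a + b) * c == (a * c) + (b * c), so the bracketed
# form can be rewritten using only multiplication-first grouping.
assert (1 + 2) * 3 == (1 * 3) + (2 * 3)
```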
If you don’t accept adding and subtracting numbers as allowed mathematical transactions, multiplication doesn’t make sense at all. It isn’t arbitrary. It’s fundamental basic accounting.
What you just said is at best irrelevant and at worst meaningless. No, the fact that multiplication is defined in terms of addition does not mean that it is required or natural to evaluate multiplication before addition when parsing a mathematical expression. The latter is a purely syntactic convention. It is arbitrary. It isn’t “accounting.”
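To make "purely syntactic convention" concrete, here is a minimal sketch (Python; all names are my own, nothing from an actual parser library) of a precedence-climbing evaluator where the precedence table is just a parameter. The same token string 1+2*3 evaluates to 7 or 9 depending only on which numbers you put in the table:

```python
import re

def tokenize(expr):
    # Split into integer literals and the operators + and *.
    return re.findall(r"\d+|[+*]", expr)

def evaluate(tokens, prec, min_prec=0, pos=0):
    """Precedence-climbing evaluator. `prec` maps each operator to a
    binding power; higher numbers bind tighter."""
    value = int(tokens[pos])
    pos += 1
    while pos < len(tokens) and prec[tokens[pos]] >= min_prec:
        op = tokens[pos]
        # Parse the right-hand side at a tighter minimum precedence so
        # that higher-precedence operators group first.
        rhs, pos = evaluate(tokens, prec, prec[op] + 1, pos + 1)
        value = value + rhs if op == "+" else value * rhs
    return value, pos

tokens = tokenize("1+2*3")
print(evaluate(tokens, {"+": 1, "*": 2})[0])  # conventional precedence: 7
print(evaluate(tokens, {"+": 2, "*": 1})[0])  # flipped convention: 9
```

Nothing about how multiplication is defined forces either table; the grouping rule lives entirely in the parser, which is the sense in which it's arbitrary.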