Number-cast input string vs. numbers - inconsistent traces between computers?!
We're making simple input-text comparisons against some constant floating-point numbers (e.g. 4.35), using ActionScript 2 with a Flash 9 export. We're getting totally contradictory traces on two different computers compiling identical code...
Here's some simple code we're using to figure out what's going on.
Code:
trace(typeof(Number(435/100)))
trace(typeof(Number(435/100)==4.35))
trace((Number(435/100)==4.35))
trace(435/100==4.35)
trace(0.080==0.08)
trace(Number(input_txt.text)==(435/100))
trace(Number(input_txt.text)==4.35000)
In the order listed above, I get the following trace output when I export via Flash CS3 (9.0) on my PPC Mac:
number
boolean
true
true
true
true
true
(this matches our expectations)
The exact same FLA file, unmodified, gives the following trace output on two different Intel Macs (one Mac Pro, one iMac) using Flash CS3 (9.0):
number
boolean
false
false
true
true
false
Does anyone have a clue why this gives false for statements such as (Number(435/100)==4.35), (435/100==4.35), or (Number(input_txt.text)==4.35000), but doesn't give false for (Number(input_txt.text)==(435/100))? (Note: input_txt.text is an input field containing the string 4.35.)
Also, what behavior do PC users see? I'm expecting the latter results, since PCs generally use Intel chipsets too.
Note: once I compile the code on my PPC Mac, Intel Macs that run the SWF (but don't recompile the FLA) do return the correct booleans, matching the PPC output.
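For now we're sidestepping exact floating-point equality with a tolerance-based comparison like the sketch below, but we'd still like to understand the compiler difference. This is just a rough workaround, not a fix: the nearlyEqual helper name and the 0.000001 tolerance are our own choices, not anything built into Flash.
Code:
// Workaround sketch: compare within a small tolerance instead of using ==
// (assumption: 0.000001 is tight enough for our two-decimal inputs)
function nearlyEqual(a:Number, b:Number):Boolean {
    var epsilon:Number = 0.000001;
    return Math.abs(a - b) < epsilon;
}

trace(nearlyEqual(Number(input_txt.text), 4.35)); // we'd expect true regardless of which machine compiled it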