Bugzilla – Bug 830
Insufficient numerical precision of timestamped simulation time in the ascii trace file (ns-3-dev)
Last modified: 2010-05-16 18:25:14 UTC
Created attachment 776 [details]
Script to recreate bug + patch

Overview:
The simulation time, expressed in seconds and timestamped in the ascii trace file, has insufficient numerical precision. This can cause erroneous results in further analysis, especially for high-speed wired data networks.

Steps to reproduce:
1) Edit the 'examples/tutorial/first.cc' script
2) Change the values of the PointToPoint attributes:
--> "DataRate" to "1Gbps"
--> "Delay" to "1us"
3) Change the values of the UdpEchoClient attributes:
--> "MaxPackets" to 100000 (let it run for a while)
--> "Interval" to 0.001
--> "PacketSize" to 256
4) Set the application stop time to, for example, 10 seconds
5) Enable ascii tracing by adding the appropriate lines of code, e.g.:
--> AsciiTraceHelper ascii;
--> pointToPoint.EnableAsciiAll (ascii.CreateFileStream ("first_trace_file.tr"));
6) Run the simulation

Note: an already-edited script can be found in the attachment.

Actual results:
From the generated trace file we obtain:
...
+ 2 /NodeList/0/DeviceList/0 ... id 0 ...
- 2 /NodeList/0/DeviceList/0 ... id 0 ...
r 2 /NodeList/1/DeviceList/0 ... id 0 ...
...
Notice that the times of enqueueing and reception are exactly the same and equal to 2 seconds.

Expected results:
The reception time of the 256-byte packet sent at time 2 seconds should be (approximately):
Time of generation + link propagation delay + device transmission delay
= 2 + 0.000001 + (256 * 8 bits / 1 Gbps) = 2.000003048 seconds

Build date & platform:
Build 2010-03-03 on Ubuntu 9.04

NOTE: The proposed fix is included in the attachment.
With the new tracing framework, a user can set the floating-point precision of the underlying stream to whatever is required:

AsciiTraceHelper helper;
Ptr<OutputStreamWrapper> stream = helper.CreateFileStream ("file");
stream->GetStream ()->precision (8);

Does that meet your needs?