I know this kind of question has been asked several times, but I seem to have an issue in my understanding of the terms. My understanding is:
- Duration: time taken for the query to fully execute on the server (not affected by things like network latency/speed)
- Fetch: time taken for the data to fully transfer to the client (affected by things like network latency/speed)
This does not appear to be the case in 5.2.47.
For a query like SELECT * FROM tbl WHERE id = x (where id is indexed):
Over the internet, Duration is between 0.015 and 0.032 sec and Fetch is 0.
Over a local network, both Duration and Fetch are 0.
Is my understanding wrong or is this a bug?
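For reference, one way I could cross-check the server-side time independently of Workbench's Duration column is MySQL's session profiler (a minimal sketch; it assumes profiling is available on this 5.5 server, and tbl/id stand in for the real table and indexed column):

SET profiling = 1;
SELECT * FROM tbl WHERE id = 1;   -- the query in question; 1 stands in for x
SHOW PROFILES;                    -- Duration reported here is measured on the server itself
SET profiling = 0;                -- turn the profiler back off

If SHOW PROFILES reports roughly the same time as Workbench's Duration over both connections, then Duration really is server execution time and the network should only show up in Fetch.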
Workbench: v5.2.47
OS: Win 7 SP1 Enterprise 64-bit
MySQL server: 5.5.22
Area: Query editor – output window