This cannot be answered convincingly in the abstract because it depends entirely on the type of object.
- One can easily envisage an object that (for example) has to establish a remote connection when it is created, which involves a DNS lookup, handshaking and authentication. Whereas if you reset the object, it might retain the existing connection and just reset its internal state.
In this case, re-using might save substantial time.
- On the other hand, you can also envisage an object with substantial (perhaps hierarchical) internal state, where resetting would take a substantial amount of time walking through the structure and setting values back to their defaults. Whereas simply discarding the whole object and creating a new one would be quicker.
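To make the first case concrete, here is a minimal sketch of such a class. Note that `RemoteThing` and everything in it are hypothetical, and the "connection" is faked with a string rather than real networking:

```perl
use strict;
use warnings;

package RemoteThing;

# Hypothetical class: expensive to construct, cheap to reset.
sub new {
    my ( $class, %args ) = @_;
    my $self = bless { host => $args{host}, state => {} }, $class;

    # Imagine a DNS lookup, handshake and authentication happening here.
    $self->{conn} = "connection-to-$self->{host}";
    return $self;
}

# reset() keeps the costly connection and discards only the mutable state.
sub reset {
    my ($self) = @_;
    $self->{state} = {};
    return $self;
}

package main;

my $thing = RemoteThing->new( host => 'db.example' );
$thing->{state}{rows_seen} = 42;
$thing->reset;    # connection survives, state is back to empty
```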
The bottom line is: it depends. And your best way to find out is to just time it both ways using the actual class you're concerned about.
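For example, with the core Benchmark module you can compare the two approaches directly. `Widget` below is a toy stand-in; substitute the class you are actually concerned about, and raise the iteration count until the results stabilise:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Toy stand-in class with some internal state to create or reset.
package Widget;
sub new   { my ($class) = @_; bless { data => { map { $_ => 0 } 1 .. 50 } }, $class }
sub reset { my ($self) = @_; $self->{data}{$_} = 0 for keys %{ $self->{data} }; $self }

package main;

my $obj = Widget->new;

# Compare creating a fresh object each time against resetting one in place.
cmpthese( 20_000, {
    create => sub { my $w = Widget->new },
    reuse  => sub { $obj->reset },
} );
```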
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
In the absence of evidence, opinion is indistinguishable from prejudice.
Thank you, BrowserUk.
So the answer depends on how much overhead, or cost, is involved in creating an object.
My benchmark shows that reusing an object is more efficient than creating a new one, whether the object is complicated or simple, which seems reasonable to me.
So, in general, can we claim that it is 'the best practice' to always reuse an object instead of creating a new one?
I would never say that the general best practice is to reuse an object rather than create a new one. The best practice may be entirely the opposite. In the general case, NOT tinkering with an object's guts is the best practice. In the specific case where time efficiency trumps all other concerns, and the object is cheaper to reuse than to regenerate, then it may be a useful practice. That's about the strongest assertion you could make.
If it is useful in your case to reuse an object, it may be a good practice to create a subclass of the object that provides a "reset" method, or that allows "new" to be called on the object, with "reset" semantics, although that might confuse people who expect $obj->new to create a clone. The point being, such a subclass would allow the calling code to avoid touching the object's internals, which is a better practice.
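As a sketch of that idea, assuming a hash-based object (`Widget` here is a hypothetical base class; only the `reset` method matters):

```perl
use strict;
use warnings;

package Widget;    # hypothetical stand-in base class
sub new { my $class = ref( $_[0] ) || $_[0]; bless { count => 0, items => [] }, $class }

package Reusable::Widget;
our @ISA = ('Widget');

# reset(): restore default state in place, so calling code never
# has to touch the object's internals directly.
sub reset {
    my ($self) = @_;
    my $fresh = ref($self)->new;    # assumes a hash-based object
    %$self = %$fresh;               # overwrite state, keep the same reference
    return $self;
}

package main;

my $w = Reusable::Widget->new;
$w->{count} = 42;
push @{ $w->{items} }, 'x';
$w->reset;    # same reference, back to default state
```

Because `new` accepts either a class name or an instance, `$obj->new` also works here; but as noted above, that may surprise readers who expect it to return a separate object, so a plainly named `reset` is the clearer interface.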
Maybe you already know this, but be sure to measure performance in a good context. Sometimes the same benchmark (the same set of instructions) gives strange results because the CPU and RAM are being used by other processes, or the network is slow. Benchmarking on a busy server can give weird results! Limit the benchmark to the necessary instructions, and also make sure it runs long enough.
Is this to address a real, measured performance issue, or something you've been pondering as an abstract thought exercise?
For almost all practical code, the performance difference is likely to be tiny compared to the rest of the work performed by the code. The biggest win is writing the code to be easily maintained in the first instance, and worrying about run-time performance only when it becomes an issue.
Premature optimization is the root of all job security