iOpenShell » Technical questions » The estimation of CCSD(T) calculation disk and memory size?

The estimation of CCSD(T) calculation disk and memory size?

Moderators: kadir, krylov, piotr.

Page: 1

Author Post
Member
Registered: Sep 2007
Posts: 2
Is there any formula or equation to estimate the size of a CCSD(T) calculation in Gaussian03, for example from the number of basis functions and the number of atoms?
Member
Registered: Sep 2007
Posts: 11
There is a simple way to do it. Run CCSD(T) on a smaller system (such as water) and note the time it takes. Count the number of basis functions in the small system and in your big system. Since CCSD(T) scales as the 7th power of the number of basis functions, the estimated time for your actual system is:

(# big-system basis functions / # small-system basis functions)^7 * time for the small system

If you want disk space, just substitute the disk space used by the small system for its time in the formula and you're all set, although (stop me if I'm wrong) G03 uses Direct for everything, so disk space shouldn't be a problem. For some more examples, see: http://www.cmms.pitt.edu/~glen/index.php/Main_page/benzene_water/df-dft-sapt_bw#Estimated_Timings_for_10_and_8_Frequencies_for_Coronene_and_Supercoronene There my computational method scaled as N^4.5, so just replace 4.5 with 7.
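The extrapolation described above can be sketched in a few lines of Python. This is a minimal illustration only; the benchmark time and basis-set counts below are hypothetical placeholders, not measured values.

```python
# Rough wall-time extrapolation from a small benchmark run.
# CCSD(T) time scales roughly as N^7 in the number of basis functions N,
# so we scale a measured small-system time by the ratio of basis-set
# sizes raised to that power.

def extrapolate(small_value, n_small, n_big, power):
    """Scale a measured quantity (time, or disk with a lower power)
    from a small system with n_small basis functions to a big system
    with n_big basis functions."""
    return small_value * (n_big / n_small) ** power

# Hypothetical numbers: suppose the small job took 60 s with 24 basis
# functions, and the target system has 240 basis functions.
t_big = extrapolate(60.0, 24, 240, 7)
print(t_big)  # 600000000.0 s, i.e. roughly 19 years
```

The same helper works for the N^4.5 method mentioned in the linked page by passing `power=4.5` instead of 7.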
Administrator
Registered: Aug 2007
Posts: 200
Hi Glen! I don't think disk space follows the scaling you use for correlated single-point timings. Check the manual for the program you use; it should give you formulas for estimating disk space in terms of the number of orbitals, etc. Doing the same type of estimate for disk size as for time probably works in your case by accident. For CCSD(T), disk space shouldn't scale as N^7; it should be much better, usually something like N^4 or N^3. Cheers!
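As a concrete example of the N^4-type disk scaling mentioned above: the CCSD doubles amplitudes t2[i,j,a,b] alone require on the order of o^2 * v^2 words of storage (o occupied, v virtual orbitals). The sketch below is a back-of-the-envelope estimate only; real programs store several such intermediates, so check the manual of your code for its actual formula. The orbital counts used are hypothetical.

```python
# Back-of-the-envelope disk estimate for one o^2 v^2 CCSD intermediate
# (the doubles amplitudes). This is the N^4-type term; actual codes
# store several arrays like this, so the real requirement is larger.

def t2_storage_gib(n_occ, n_virt, bytes_per_word=8):
    """Disk needed for one o^2 * v^2 array of doubles, in GiB."""
    words = (n_occ ** 2) * (n_virt ** 2)
    return words * bytes_per_word / 2 ** 30

# Hypothetical system: 20 occupied and 200 virtual orbitals.
print(round(t2_storage_gib(20, 200), 2))  # 0.12 GiB for this one array
```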
« Last edit by kadir on Fri Oct 05, 2007 12:15 am. »
Member
Registered: Sep 2007
Posts: 11
Disk space is a little harder to estimate than timing. A lot of it depends on how the program itself handles disk usage. The scaling scheme I use does provide a nice upper bound on how much disk you'll need. But since G03 is being used, I don't think that's of much concern, as by default it uses the Direct algorithm for everything.
Administrator
Registered: Aug 2007
Posts: 200
glen wrote
Disk space is a little harder to estimate than timing. A lot of it depends on how the program itself handles disk usage. The scaling scheme I use does provide a nice upper bound on how much disk you'll need. But since G03 is being used, I don't think that's of much concern, as by default it uses the Direct algorithm for everything.


That kind of upper-bound estimate for disk size can be orders of magnitude off ;)
By the way, CCSD(T) is a very I/O-heavy method. I do not think G03 has a direct algorithm for it. Forget CCSD(T); I do not think it even has a direct CCSD. I haven't checked many of the recent versions of the popular programs, but until recently Molpro was the only one I knew of that had a direct CCSD [and not CCSD(T)].
« Last edit by kadir on Mon Oct 08, 2007 7:28 am. »
imjames407
Guest
No idea man sorry.
Member
Registered: Aug 2008
Posts: 43
CFOUR doesn't have a fully direct method, but its partial AO-basis option does a good job of cutting down the disk requirements.
Member
Registered: Aug 2008
Posts: 43
You can force G03 to recalculate various integrals instead of storing them by using the trans keyword. trans(iabc) became the new default in G09, but there are several levels one can use to move further toward integral-direct mode.
