clutchtamer
Proven Member, Apr 29, 2015, Concrete, Washington
A little background first. Here's something to think about: if you know an engine's average steady-state torque output over a defined rpm band, as well as its rate of full-throttle acceleration (with no external load) over that same band, you can calculate the approximate torque-seconds required to accelerate that engine's rotating mass across the band. From that, you can calculate how the required torque changes when that acceleration period is compressed or extended (twice the torque is required to do the same work in half the time). This gives you useful numbers for estimating the intensity of the torque spike created by rpm loss, either during a launch until clutch lockup or during a gear change.
I’m using general numbers here, hopefully to make this concept easier to understand….
…Let's say a generic 450 ft/lb engine is tested for how fast it can accelerate at WOT with the clutch pushed in, no load applied. The acceleration rate is found to be 8500 rpm/second through the heart of its torque band. At this rate, the engine gains roughly 2000 rpm in .235 seconds.
…Now the car is launched, and .235 seconds later rpm has dropped 2000 rpm as the clutch locks up. If it took 450 ft/lbs to accelerate the engine's inertia 2000 rpm in .235 seconds, it also took 450 ft/lbs to remove 2000 rpm from that launch rpm in the same .235-second period. Where did this 450 ft/lbs of inertia energy, discharged over .235 seconds, go? Into the transmission's input shaft, alongside the engine's WOT 450 ft/lbs. That's a total of 900 ft/lbs for a brief .235-second period; then torque drops below 450 ft/lbs as the engine starts gaining rpm and some of its output is siphoned off to recharge the rotating assembly's inertia. To the driver, this change in overall torque output might feel like a bog.
From there you can play with the rate of the rpm loss and how that affects the torque the input shaft will see. Removing 2000 rpm from the rotating assembly over twice the period of time requires half the torque, so doubling the duration to .47 seconds cuts that torque spike in half, to 225 ft/lbs. In this instance, the input shaft sees 675 ft/lbs for .47 seconds before dropping below 450 ft/lbs. Speed that 2000 rpm loss up to just .118 seconds (typical of what you might see with a grabby clutch), and the torque spike increases to 900 ft/lbs. In that case, the input shaft would see 1350 ft/lbs for .118 seconds before dropping below 450 ft/lbs. This would feel like a huge bog, and be more likely to break a transmission. Whether your ultimate limiting factor is breaking the transmission or exceeding the traction potential of the tire, reducing this torque spike is beneficial either way.
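The arithmetic above can be sketched out in a few lines of Python. The numbers and the linear scaling (torque proportional to deceleration rate) come straight from the example; the function name and constants are just my labels:

```python
ENGINE_TORQUE = 450.0   # ft/lbs, steady WOT output from the example
FREE_REV_RATE = 8500.0  # rpm/s, no-load WOT acceleration rate from the test

def inertia_torque(rpm_drop, duration_s):
    """Torque needed to remove rpm_drop from the rotating assembly in
    duration_s. Scaled from the free-rev test: the same rotating mass
    that 450 ft/lbs spins up at 8500 rpm/s gives torque back in
    proportion to how fast it is decelerated."""
    decel_rate = rpm_drop / duration_s  # rpm/s
    return ENGINE_TORQUE * decel_rate / FREE_REV_RATE

# Time to gain (or lose) 2000 rpm at the free-rev rate: ~.235 s
base = 2000 / FREE_REV_RATE

for duration in (2 * base, base, base / 2):
    spike = inertia_torque(2000, duration)
    total = ENGINE_TORQUE + spike  # inertia spike plus WOT engine torque
    print(f"{duration:.3f} s drop: spike {spike:.0f} ft/lbs, "
          f"input shaft sees {total:.0f} ft/lbs")
```

Running it reproduces the three cases above: 225/675, 450/900, and 900/1350 ft/lbs for the slow, test-rate, and grabby-clutch rpm drops respectively.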
As you know, real-life power application isn't binary like my example, but I hope this helps illustrate where I'm coming from.
Another way of putting it, which path do you think would result in a quicker car?...
...if you are launching from 6000 and rpm dips to 4000 as the clutch locks up, you launched with inertia energy in addition to the power the engine was making. That added inertia energy was only temporary, though, and has to be repaid as the engine begins gaining rpm. But the drivetrain had to have enough reserve capacity to handle that inertia surge, so you are only really using the drivetrain to its full capacity for the first few tenths of a second (during the inertia discharge).
...add clutch slip. If you don't lose rpm after launch, there is no added inertia energy being dumped into the drivetrain. You are now launching on motor power alone, so there is little reason to launch any higher than your torque peak (~4500 rpm). The real advantage comes when you realize that your drivetrain now has extra reserve capacity that you are not using! Now you can add more engine power without breaking the drivetrain. Unlike inertia energy that only lasts a few tenths of a second, added engine power will be there for the entire duration of the run.
Any added launch energy from a 6000+ launch comes from an inertia energy discharge, which results in rpm loss. If your rpm does not dip after launch, the stored energy of a 6000+ launch will net you little beyond what you could get launching from your torque peak (maybe 4500?). The lower rpm launch will also result in a lot less clutch wear, as a 6000 launch packs 78% more inertia energy than one at 4500. This might seem counterintuitive, but you may see less wear/tear on the clutch despite slipping for a longer period of time…
…launching from a lower rpm without bogging = less total revolutions of slip
…most of the slip will now happen with less than full pressure-plate clamp on the disc.
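That 78% figure falls out of rotational kinetic energy scaling with rpm squared, so the ratio between two launch rpms doesn't require knowing the actual moment of inertia. A quick sketch (function name is mine):

```python
def energy_ratio(rpm_hi, rpm_lo):
    """Ratio of stored rotational kinetic energy at two engine speeds.
    KE = 1/2 * J * w^2, so for the same rotating assembly the inertia
    J cancels and only the speed ratio squared remains."""
    return (rpm_hi / rpm_lo) ** 2

extra = energy_ratio(6000, 4500) - 1.0
print(f"A 6000 rpm launch stores {extra:.0%} more inertia energy than 4500 rpm")
# → "A 6000 rpm launch stores 78% more inertia energy than 4500 rpm"
```

That extra 78% is energy the slipping clutch has to either transmit or burn off as heat, which is why the lower launch can mean less clutch wear even with a longer slip period.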
Later today when I get some time, I'll post up some pictures of how you can take advantage of this with a little simple fabrication.