Lots of "sort of on topic" points here for me:
A) If the charging puck is labeled "PD-30US" but doesn't list anything to back up the 30W claim, only 5V/3A, that's just 15W, correct?
Just my opinion, but I wouldn't trust labeling that calls a fixed-output 5V/3A supply "Power Delivery" per the PD standards. 5V/3A (which is 15W) is the basic USB charging spec, not a PD profile.
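For anyone who wants the arithmetic spelled out, here's a trivial sketch. The profile list is made up for illustration, not read off any actual "PD-30US" label; the point is just that a label listing only 5V/3A tops out at 15W no matter what the model number implies.

```python
# Wattage of each output profile printed on a charger label (watts = volts * amps).
# These profiles are hypothetical examples, not from any specific charging puck.
profiles = [
    (5.0, 3.0),    # 5V / 3A   -> basic USB, 15W
    (9.0, 3.0),    # 9V / 3A   -> 27W, a common PD profile
    (12.0, 2.5),   # 12V / 2.5A -> 30W
]

for volts, amps in profiles:
    print(f"{volts:>5.1f} V x {amps:.1f} A = {volts * amps:.0f} W")
```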
B) 10 days until auto-discharge is too long, IMHO. I like the 5-day setting personally. I set up my departments with the mindset that flying/updating is a better way to "discharge," so they always try to use the charge in some way instead of letting the internal circuitry just bleed the energy off as heat. That seems like a waste of energy, a charge cycle, and battery life to me.
In your situation, where the drones are being used constantly, or at least more regularly than many hobbyists fly, that makes a lot of sense.
OTOH, for a battery that spends the majority of its life in storage, I don't think 5 vs. 10 days until auto-discharge makes any meaningful difference.
C) I always prefer a slow/gentle charge to be easier on the cells, but sometimes we "need" to rush things. Older technology didn't really "rush" very well at all, and more often than not you would either ruin the battery or start a fire... Bad, very very bad! This is why we charged in fireproof setups, just in case.
Smart
Don't overlook, though, how much li-ion batteries have advanced. The cells that DJI (and Tesla, Samsung, etc.) are using really are far more tolerant of charging rates up to 1C. Charging at 1C is pretty safe these days, both in terms of a charging catastrophe and in terms of damage to cell longevity.
Beware, however, that this improvement has also driven the availability of cheap 18650s based on older technology that don't like anything above 0.1C during the constant-current phase of the charge cycle.
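To put numbers on "C rate" for anyone newer to this: the C rate is just charge current divided by pack capacity. A minimal sketch below, using a made-up capacity rather than any specific DJI battery's spec:

```python
# Rough C-rate arithmetic. The capacity value is a placeholder, not a spec for any
# particular pack; check your battery's label for the real number.
capacity_mah = 3850            # hypothetical pack capacity in mAh
capacity_ah = capacity_mah / 1000.0

for c_rate in (1.0, 0.5, 0.1):
    current_a = capacity_ah * c_rate
    hours = 1.0 / c_rate       # idealized CC-phase time, ignores the CV taper at the end
    print(f"{c_rate:>4.1f}C -> {current_a:.2f} A, roughly {hours:.0f} h of CC charging")
```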
D) In terms of "charger power," are these Mini 3/4 batteries able to "take" more current? Previous DJI batteries had internal circuitry that limited/controlled the charging: regardless (up to a point) of how strong the charger was, the battery would regulate the incoming current and the charge rate. Does anyone here have actual DATA to demonstrate that increasing the watts will indeed provide a faster charge on these batteries? I'm not talking about comparing a low-watt wall wart to the high-watt units, but what about going from 15W to 30W or 45W? How much difference does that actually make?
I'm going to order one of the parallel chargers, as I've had GREAT success with them for several years on every single aircraft I've owned. The ability to charge 4 batteries in almost the same amount of time as a single one is a HUGE plus when time is critical . . .
Remember that a power supply cannot force an arbitrary current into a load at a particular voltage. The supply (or charger) presents a voltage to the load, and the load draws whatever current it draws at that voltage. Controlling the current requires varying the voltage.
So a 100W power supply simply has the capacity to deliver 100W to the load. It can't force a load to draw 100W.
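That's the key to the "does a bigger charger charge faster" question above. A back-of-the-envelope sketch, where the battery's acceptance limit is an assumed number for illustration, not a measured DJI figure:

```python
# Why a bigger charger doesn't automatically mean a faster charge: the battery's
# own charging circuitry sets the ceiling on what it will draw.
# All numbers here are assumptions for illustration, not measured DJI specs.

def effective_charge_power(charger_max_w: float, battery_accept_w: float) -> float:
    """The battery draws what it wants, capped by what the supply can deliver."""
    return min(charger_max_w, battery_accept_w)

battery_limit_w = 29.0  # hypothetical: what the pack's BMS is willing to pull
for charger_w in (15, 30, 45, 65):
    p = effective_charge_power(charger_w, battery_limit_w)
    print(f"{charger_w:>3}W charger -> battery actually draws about {p:.0f}W")
```

Under that assumption, stepping up from 15W to 30W helps, but 30W to 45W buys you nothing; only real measurements will tell you where the actual ceiling sits for these packs.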
PD was a necessary technical advance to deliver higher power to USB-connected devices. The legacy USB spec is 5V and only 5V, so the amount of power that can be delivered over a USB cable of a given wire gauge and length is limited by resistive losses, which scale with the square of the current. Double the current and the cable dissipates 4x as much power as heat.
So at 5V, delivering 100W to a load would mean the cable has to carry 20A. It would get very hot, maybe even hot enough to melt the insulation. Or it would have to be made of such thick-gauge wire that it would be impractically stiff and useless.
Enter PD. 100W is delivered by negotiating a voltage that can carry that power without frying the cable (and the PCB traces, MOSFETs, etc. down at the load). In this case, the load and supply would agree on 20V, and the load would draw 5A, which is pretty tame.
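Here's that trade-off in numbers. The cable resistance below is an assumed round figure for a typical short USB cable, not a measured value, but the trend is what matters:

```python
# Why PD negotiates a higher voltage for high power: cable heating scales with
# current squared (P_loss = I^2 * R), so halving the current cuts the loss to a quarter.

cable_resistance_ohms = 0.2   # assumed round-trip resistance of a ~1m cable (illustrative)
power_to_deliver_w = 100.0

for volts in (5.0, 9.0, 15.0, 20.0):
    amps = power_to_deliver_w / volts
    loss_w = amps ** 2 * cable_resistance_ohms
    print(f"{volts:>5.1f} V -> {amps:5.1f} A, ~{loss_w:5.1f} W lost heating the cable")
```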
So why wasn't USB 20V to start? Or 100V?
Safety. Electricity gets more dangerous the higher the voltage goes. Plugging in a USB connector can spark at 20V where it doesn't at 5V, and there are other safety concerns besides. That's also why PD starts every connection at 5V and only steps up to a higher voltage after the supply and device have negotiated it.