All convolutions within a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only feasible if the height and width of the feature maps stay unchanged, so the convolutions in a dense block all use stride 1. Pooling layers are inserted between dense blocks to downsample the feature maps.
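As a concrete illustration, here is a minimal PyTorch sketch of one dense-block layer and a between-block pooling step; the framework choice and the names `DenseLayer`, `TransitionDown`, and `growth_rate` are assumptions for illustration, not taken from the text. The stride-1, padding-1 convolution preserves height and width, which is exactly what makes the channel-wise concatenation valid.

```python
import torch
import torch.nn as nn


class DenseLayer(nn.Module):
    """One BN-ReLU-Conv unit. The 3x3 conv uses stride 1 and padding 1,
    so the spatial dimensions (H, W) of the input are preserved."""

    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, stride=1, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.conv(self.relu(self.bn(x)))
        # Concatenating along the channel axis only works because
        # x and out share the same H and W.
        return torch.cat([x, out], dim=1)


class TransitionDown(nn.Module):
    """Pooling inserted between dense blocks; halves H and W."""

    def __init__(self):
        super().__init__()
        self.pool = nn.AvgPool2d(kernel_size=2, stride=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pool(x)


# Shape check: channels grow by growth_rate, H and W are unchanged
# inside the block, then halved by the transition pooling.
x = torch.randn(1, 16, 32, 32)
layer = DenseLayer(in_channels=16, growth_rate=12)
y = layer(x)                  # -> torch.Size([1, 28, 32, 32])
z = TransitionDown()(y)       # -> torch.Size([1, 28, 16, 16])
print(y.shape, z.shape)
```

Note how downsampling is deferred entirely to the transition step: within the block every layer sees feature maps of the same resolution, so any layer's output can be concatenated with any earlier one.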