Forage Inventories: Should I Take A Fall Harvest?


As summer starts winding down you should have a fairly good idea of forage inventories.

Any supplies of corn silage should be approaching their annual minimums. Not that you should be almost out, since ideally you should have enough to continue to feed “old crop” corn silage until the end of the calendar year or perhaps a bit longer. But soon the remainder of the 2016 crop will be supplemented with fresh-chopped corn, and by about the end of September your corn silage inventory should be at its annual peak.

By now you should know where the new crop corn will be stored – and not on top of (if you have tower silos) or in front of (bunker silos) last year’s corn silage. You don’t want to bury well-fermented silage – you want to feed it!

Some farmers put up one or more bags of corn silage to feed between corn harvest and when the new crop is fully fermented, a process that usually takes several months. Another way to accomplish this, particularly on larger dairies, is to have enough “old crop” corn silage in a drive-over pile for three or four months of feed.

Farmers have long known that cows often drop in milk if they’re fed unfermented (recently chopped) corn forage. Part of the problem is that rumen bacteria take time to adjust to what is a very different feedstuff with a much higher pH.

But it’s not all due to what my old friend Dr. Charlie Sniffen calls “rumen buggies”: We now know that starch digestibility increases considerably with storage time, not reaching its peak until after several months of fermentation.

Your 2017 corn crop: Harvest for silage, grain or …?

There are readily available silo capacity tables, online and elsewhere, that will approximate how much silage you have in each of your storages. By now you should also have some idea of the yield of your corn crop – good, bad or about average. Combining this information – what you have in inventory and your approximate production – should tell you about how many acres of corn you’ll need to meet the needs of your herd. (If you won’t have enough 2016 crop to feed through the end of 2017, or if you have a big crop this year, be sure to put up enough extra so that you won’t be in the same situation a year from now.)
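As a back-of-envelope sketch of that inventory arithmetic – every number below (herd size, per-cow silage need, carryover, yield) is a hypothetical example, not a figure from this article:

```python
# Rough corn-silage acreage estimate -- all inputs are hypothetical examples.
TONS_PER_COW_PER_YEAR = 9.0   # assumed as-fed silage need per cow per year
carryover_tons = 400.0        # old-crop silage still in storage
herd_size = 150
expected_yield = 18.0         # assumed tons of silage per acre

annual_need = herd_size * TONS_PER_COW_PER_YEAR
tons_to_harvest = max(annual_need - carryover_tons, 0.0)
acres_needed = tons_to_harvest / expected_yield

print(f"Need to harvest about {tons_to_harvest:.0f} tons, "
      f"or roughly {acres_needed:.0f} acres")
```

The same three inputs – inventory, demand and expected yield – drive the decision whether to buy silage, sell surplus or combine some acres for grain.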

If you won’t have enough, now is the time to start making plans to purchase some at harvest. If you know that you’ll have more than enough, you may want to harvest some for grain (using your own equipment or by having it custom combined). Another alternative is to sell surplus corn either by the acre or by the ton. My preference would be to sell it by the ton since the seller often winds up with more income and there’s less chance for a disappointed buyer if the crop yields less than anticipated. When corn grain prices are low you can expect more farmers to be willing to sell their surplus as whole-plant silage.

Is this trip really necessary?

A decision farmers growing alfalfa and alfalfa grass need to make each fall is whether they should take a fall harvest. Following is a brief true-and-false “quiz” on this decision:

Alfalfa repeatedly harvested at the bud stage enters the fall in a weakened state because it didn’t have enough time between harvests to adequately recover root reserves.

True. Root carbohydrates aren’t fully replenished until the alfalfa plant is in bloom – something that often doesn’t happen all summer. In fact, modern alfalfa harvest management may result in the farmer never seeing an alfalfa blossom from seeding until the field is rotated out of alfalfa.

Taking a fall harvest of alfalfa following a summer of bud-stage harvest will usually result in a lower first-cut yield the following spring.

True. This has been proven by research in the upper Midwest. It’s not easy to fool Mother Nature.

Leaving a big yield of alfalfa in the fall can result in it lodging (falling down) and smothering the alfalfa.

False. I cannot remember the last time I saw a “big” yield of fall-grown alfalfa. The crop may look like it will yield well, but those large leaflets can be deceiving. What usually happens is that frost kills the top growth and the leaves fall to the ground. There aren’t nearly enough leaves on fall-grown alfalfa to smother a healthy alfalfa plant. In some winters, especially when there isn’t a heavy snow cover, many alfalfa stems – now leafless – remain standing all winter.

If an alfalfa field isn’t harvested in the fall, the farmer may have to deal with the dead alfalfa stubble in the first cut the following year.

Both true and false. While it’s true that an open winter can seem to leave a lot of bleached-out alfalfa stems, research has shown that the impact of these stems on the quality of the following year’s first cut is quite small – about 1 percent higher neutral detergent fiber than if there weren’t any old stems in the forage. The only time I’d be concerned about the presence of a small number of old stems in the new crop is if the alfalfa is baled for dry hay and then sold to horse owners, some of whom are notoriously picky about the forage they buy. Not that those old stems would hurt a horse, but appearance is important.

My recommendation regarding fall harvest of alfalfa and alfalfa grass: Take a fall harvest only if you need the forage.

Fall-harvested alfalfa is certainly not my favorite forage, and I’ve heard other farmers say that fall-harvested alfalfa silage doesn’t feed as well as the forage analyses suggest it should. I suspect that silage fermentation has something to do with it: Frost and even an extended spell of cold weather can deplete the naturally occurring populations of fermentation bacteria.

It’s often cold when we ensile this forage. It’s also cold when we harvest and ensile the last of our corn crop, but corn contains a lot of plant sugars, which are the food of fermentation bacteria. Finally, fall-harvested alfalfa is usually high in crude protein – almost always over 20 percent crude protein (CP) and often 25 percent or higher. By comparison, corn silage is about 8 percent CP. The higher the crude protein in a crop, the harder it is to ferment.

So, if you don’t need the forage, park your mower-conditioner by about Labor Day and let your alfalfa enjoy an uninterrupted fall of growth to replenish root carbohydrates.

The Importance of Proper Soil Sampling


Trying to grow crops without knowing the soil pH or fertility levels of each crop field is like driving a car with a broken speedometer: Sooner or later you’re going to get in trouble. Most farmers realize this and rely on soil analysis, with the sampling done either by someone on the farm or by its crop consulting firm.

Crop consultants are more involved in soil sampling these days, especially since one of the requirements of the nutrient management plan that’s part of concentrated animal feeding operation (CAFO) law is that all crop fields be sampled at least once every three years. (Check with your consultant or state authorities on the required frequency of sampling.) Most agronomists recommend sampling cropland every two or three years. Annual sampling probably won’t provide much more useful information, so it isn’t needed unless something dramatic is happening in a field – for instance, a nutrient-related problem or an unusually heavy manure application.

Crop consultants are aware of the importance of proper sampling. However, too often a farmer assigns the job of soil sampling to an employee, sometimes the “low man on the totem pole.” This can be a problem, especially if the employee isn’t trained in proper sampling technique.

One in a million?

Most soil test labs request a pint of soil – about 1 pound – for each field sampled. The soil in a 6- to 7-inch plow layer weighs about 2 million pounds per acre. Therefore, in a 10-acre field a properly taken soil sample asks that 1 pound of soil submitted to the soil test lab accurately represent the fertility of 20 million pounds of soil!
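That arithmetic is simple enough to sketch:

```python
# The "1 pound in 20 million" arithmetic behind a composite soil sample.
PLOW_LAYER_LB_PER_ACRE = 2_000_000  # approx. weight of a 6- to 7-inch plow layer
sample_lb = 1                       # one pint of soil is about a pound

field_acres = 10
field_soil_lb = field_acres * PLOW_LAYER_LB_PER_ACRE
print(f"A {sample_lb} lb sample must represent {field_soil_lb:,} lb of soil "
      f"(1 part in {field_soil_lb // sample_lb:,})")
```

Scaling `field_acres` up shows why the 20-acre-per-sample ceiling exists: the larger the field, the more soil each pound of sample has to speak for.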

That’s why it’s recommended that a soil sample shouldn’t represent more than 20 acres of cropland. If you have fields larger than 20 acres, divide the field in some logical way – perhaps by soil type if the field contains at least two soil types. In doing so, think about what you’d do if the two (or more) samples turn out to be considerably different in pH or fertility. Differences in pH can be managed rather easily by instructing the driver of the lime truck as to what portion of the field he should spread and at what rate. If the differences are in soil fertility you may need to selectively broadcast fertilizer – even if you normally rely on planter-applied fertilizer to do the job.

Soil sampling is serious stuff

If someone on the farm does the soil sampling, consider the following: First, the person must be reliable. When he or she is out “in the back forty” taking samples, you won’t know if the person is taking the recommended 15 to 20 cores (subsamples) per field of 20 acres or less, or simply walking into the field and taking it all from one or two spots. The person should be told how to properly take samples, and that this is a job that could have a significant impact on fertilizer rates, crop yield and quality. During the many years I was agronomist at Miner Institute, soil sampling was done by our field crops supervisor or an experienced, trusted employee. After the farm became a CAFO we employed a crop consulting firm that would do the sampling if asked, but this was one job we continued to do ourselves.

Getting the job done right

Assuming you’ve decided to do the soil sampling yourself or have a trusted farm employee do it, following are some guidelines: Sample when row crops such as corn are not growing in the field. My preference is to sample in the late summer or fall, but if you always sample at the same time of the year, then spring is okay. Nutrient availability – particularly of potassium – varies with the seasons, so whichever sampling season you choose, stick with it:

  • Depending on how many fields you have on the farm, you may decide on sampling one-third of the fields each year. If CAFO regulations require sampling at least every three years, this will meet the requirement while not resulting in too much sampling effort in any one year. Try to sample fields to be seeded to alfalfa (or any crop requiring a high soil pH) at least 6 months ahead of seeding, especially if there’s a chance that a high rate of agricultural limestone will be needed.
  • Avoid sampling fields where lime or manure was recently applied. Brush aside any crop residue and other debris from the soil surface. Using a soil test auger or soil tube (these can be purchased from Nasco and other farm supply companies), take cores to a depth of 6 to 8 inches. If there’s an obvious change in soil type at a slightly lesser depth, you might want to sample only to that depth. For long-term no-till or minimum-till fields it’s also a good idea to take a separate soil sample from the top 2 to 4 inches, primarily to check on soil pH. Repeated applications of nitrogen fertilizer can acidify the top couple inches of soil – sometimes called an “acid roof.”
  • Put the individual cores in a plastic pail, and when you have the necessary 15 to 20 cores, break them up and thoroughly mix the now-granulated soil. If the soil has a high clay content and is too wet to mix properly, wait for it to dry, but don’t heat the sample. Remove anything that’s not soil and submit a pint of soil (or whatever amount the soil test lab requires). Be accurate when filling out the soil sample information sheet, and fill it out completely: The more information you provide to the soil test lab, the more accurate the fertilizer recommendations. Also, it’s important to choose a reputable analytical lab and use it every year. This avoids the confusion resulting from the various soil extractants used by different soil test labs.



Why It’s Important to Understand Forage Problems


It’s probably too late to do anything about any problems you may find with your forage, but knowing what happened – and why – is essential in trying to prevent the same things from happening again. To quote Albert Einstein, insanity is doing the same thing over and over again and expecting different results.

Some readers may wonder why corn is mentioned in an article with “forage” in the title. I consider corn to be two crops, each with different characteristics: The kernels are a high-quality grain while the rest of the plant (the stover) is a modest quality warm-season grass. When the corn plant is harvested for grain the remaining plant is low in quality, similar to a forage grass harvested well after heading. Occasionally corn stover is baled and fed to wintering beef cows or, where there’s a serious shortage of quality forages, fed in modest quantities to dairy cattle. Several years ago I was on a crop consulting job in the Texas Panhandle during a severe drought. On some of the dairies corn stover was one of the major forages in the ration, with huge stacks of large square bales waiting to be fed. When fed to lactating dairy cows, corn stover must be supplemented with grain or higher-quality feedstuffs.

The quality of the corn plant as a forage is determined by the relative proportions of grain and stover. The higher the grain content, the higher the digestibility of the entire plant. That’s because corn grain doesn’t contain much fiber and only about 1 percent lignin, which is indigestible. I seldom include discussions about grain corn in these columns – national farm publications are chock-full of articles about this crop – but I often discuss corn silage because it’s a true forage crop. A dictionary definition of forage is “plant material that livestock graze or that is cut and fed to them.” Corn harvested for whole-plant silage perfectly fits that definition.

Corn emergence

The combination of improved seed corn production practices and better planting equipment has resulted in a slightly higher germination rate and plant population compared with the corn we planted a generation or more ago. Seed treatments (typically a combination of an insecticide and one or more fungicides) are now applied by the seed company instead of by the farmer. Not only is this safer for the farmer but there’s better adhesion of the treatment to the seed, increasing the percentage of kernels that survive to produce a plant. This, combined with higher germination rates, has resulted in a change in the recommended overplanting rate for seed corn. We used to recommend overplanting by 15 percent, so if you wanted 30,000 plants per acre at harvest you’d need to plant almost 35,000 seeds. Now, some seed company agronomists recommend overplanting by only 5 percent when field conditions are good to excellent, and by 10 percent with very early planting or under cloddy soil conditions. This means having to buy 5 percent to 10 percent less seed corn. It’s worth paying attention when a seed company recommends reducing the amount of seed that farmers would need to buy.
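The seed-count arithmetic in that paragraph is a one-liner (the function name is mine; the populations and percentages are those quoted above):

```python
# Seeds to plant per acre for a target harvest population,
# given an overplanting rate to cover germination and seedling losses.
def seeds_needed(target_plants_per_acre, overplant_rate):
    return target_plants_per_acre * (1 + overplant_rate)

target = 30_000  # desired plants per acre at harvest
print(f"15% overplant: {seeds_needed(target, 0.15):,.0f} seeds/acre")  # older guideline
print(f" 5% overplant: {seeds_needed(target, 0.05):,.0f} seeds/acre")  # good conditions
print(f"10% overplant: {seeds_needed(target, 0.10):,.0f} seeds/acre")  # early or cloddy
```

The 15 percent rate gives 34,500 seeds per acre – the “almost 35,000” in the text – while the newer 5 to 10 percent rates trim the seed bill accordingly.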

If you have a poor stand of corn it’s almost never because of a bad lot of seed. The exception would be corn that’s been stored on the farm for at least a year, especially if it’s been stored in a shed that gets hot during the summer. Heat is the enemy of farm-stored seeds. Unless storage conditions have been poor, year-old seed is seldom low enough in germination to affect planting rate, but any seed older than that should be germination tested, either on-farm or by sending a sample to an agricultural testing laboratory.

Germination tests are inexpensive, well worth doing if there’s any question about viability: Test, don’t guess! Many years ago at Miner Institute we “inherited” a few bags of seed corn that were several years old when we assumed management of the dairy operation from Cornell University. Before planting it we had the seed tested and were pleased – and a bit surprised – when the germination was about 95 percent. The corn grew just fine in the field.

Planting depth

Two factors resulting in poor emergence are corn that’s planted either too shallow or too deep. A planting depth of 2 inches is ideal for most situations, but with very early planting into cold soils, planting depth can be reduced to 1-3/4 inches. This leaves a small margin for error, since corn planted 1-1/2 inches deep should still germinate OK. Problems start to occur when planting depth is reduced to 1 inch, usually unintentionally. Seedbeds that are loose and fluffy – often the result of too much spring tillage – can settle by a 1/2 inch or so following a heavy rain, thus decreasing the planting depth. When the corn kernel winds up 1 inch deep or shallower, the nodal roots will be barely below the soil surface. Several problems can result from shallow planting, including injury to these nodal roots from some preemergence herbicides – one cause of what’s often called “rootless corn syndrome.” On the other end of the planting depth spectrum, corn can be planted to a 3-inch depth after the soil has warmed with little negative impact on emergence. However, with early planting – prior to mid-May in much of FARMING magazine’s area of coverage – planting corn too deep can result in delayed germination because of cold temperatures at that depth, leaving the kernels more susceptible to seed rots.
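A minimal sketch of those depth guidelines as a lookup (the condition labels are my own shorthand, not standard agronomy terms; the depths are the ones quoted above):

```python
# Corn planting depth guidelines from the paragraph above, in inches.
# Condition labels are my own shorthand, not standard agronomy terms.
DEPTH_GUIDE = {
    "typical": 2.0,           # ideal for most situations
    "early_cold_soil": 1.75,  # very early planting into cold soils
    "warm_soil_max": 3.0,     # acceptable once the soil has warmed
    "risky_shallow": 1.0,     # at or above this shallowness, problems begin
}

def too_shallow(depth_inches: float) -> bool:
    """Shallow enough that nodal roots sit near the soil surface."""
    return depth_inches <= DEPTH_GUIDE["risky_shallow"]

print(too_shallow(1.0))   # a settled, fluffy seedbed can get you here
print(too_shallow(2.0))   # the normal target depth
```

A half-inch of seedbed settling after heavy rain is enough to move a 1-1/2-inch planting into the `too_shallow` range, which is why the guidelines leave that margin.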

If faced with a field of corn with a poor stand, carefully dig up some emerged plants to determine how deep the corn was planted. Try to find some kernels that didn’t germinate to determine their condition. Do this when the healthy plants are no more than about 6 inches tall. If you don’t find a kernel, is it because something ate it? Seed corn maggots are a potential cause. It may also be because some of the seed wasn’t planted – it happens, though much less frequently with modern corn planters. It’s important to diagnose the reason for a thin, uneven stand of corn: Don’t hesitate to ask for help from your Extension educator or crop consultant.

Legume-grass seeding problems

As with corn, a major source of failures and inadequate stands in legume and grass seedings also relates to planting depth, either too shallow or too deep. Planting small seeds such as alfalfa and forage grasses too deep is more commonly the cause of poor stands. Although the recommendation varies with soil type, soil moisture and time of seeding, most legume and grass seeds should be planted no more than 3/8-inch deep. A good rule of thumb is that 5 to 10 percent of the seeds should be visible on the soil surface after seeding. By the time you notice problems with a seeding, however, birds and insects may have carried off any seeds that remained on the soil surface.

Herbicide damage is another potential source of injury to seedlings. This can be due to herbicide residues from the previous crop – often corn – or postemergence herbicides applied when the forages were in the seedling stage. If necessary, check with your state regulatory authorities for the herbicides that are approved for the species you’re planting. (With the exception of clear alfalfa the list will probably be short.)

Not everything is known about how the various forage grass species react to the labeled herbicides, so proceed with caution. Perhaps the most common herbicide toxicity is from atrazine, though we don’t see this nearly as often now that atrazine has to a large extent been replaced by herbicides with a shorter soil residual. Atrazine toxicity in alfalfa appears as a whitening or chlorosis of the outer margins of the leaves, followed in severe cases by the death of the plant.

Alfalfa plants produce toxins that can reduce the germination and growth of alfalfa seedlings – this is called autotoxicity. These toxins are water-soluble and are concentrated in the alfalfa leaves. How long the alfalfa was growing in the field prior to seeding as well as the amount of plant debris both have impacts on autotoxicity. Heavy soil types (clay loams vs. sandy loams) are more at risk for autotoxicity because the toxins don’t leach through the soil profile nearly as fast.

Destroying an old alfalfa sod in the fall, either by tillage or (particularly in no-till systems) with herbicides, and seeding the following spring may not allow enough time for the toxins to dissipate. Research has found substantially poorer seedling establishment when alfalfa is seeded into a recently terminated alfalfa stand compared with seeding following a different species such as corn. Alfalfa autotoxicity is usually less of a factor with conventional tillage than with no-till.

One difference between stand problems with corn vs. alfalfa is the impact that a less-than-ideal stand has on yield. Poor stands of corn almost always result in lower yields, but with alfalfa it depends on whether the problem is general across the field or in patches. Recommended seeding rates of alfalfa result in a lot more seeds per square foot than are needed for good yields. I’ve seen farmers make a big mistake in their grain drill or forage seeder settings and wind up with about half as much seed per acre as they intended. But because the population was uniform across the field, and because in most cases field conditions were very good – a fine, firm seedbed – they wound up with a perfectly adequate stand. Not that this is recommended since it was due to the combination of skill (a good seedbed) and good luck (favorable weather).

Monarch Butterflies and the Conservation Effort


It was in 1999 that professors at Cornell University sounded the alarm: Research had found that monarch butterfly larvae eating pollen from Bt corn hybrids were injured or killed by the pollen. If this had involved an unattractive moth or a lesser-known butterfly, the news probably would have gone unnoticed by the press and the public. But the news spread far in a matter of days, largely because the monarch is one of the best-known and best-loved butterflies in the world; its brilliant orange and black coloration makes it easy to recognize. There hadn’t been a hint of Bt corn pollen affecting monarch butterfly larvae in any previous scientific literature, nor was it among the concerns cited even by people opposed to genetic engineering. In 1999 corn hybrids with the Bt trait (which confers resistance to the European corn borer) were just becoming popular. At the time these hybrids represented less than 20 percent of all field corn planted in the U.S., so this was in the early years of genetically modified organisms (GMOs). The trait was rapidly becoming much more popular, however, because it was highly effective in protecting corn plants from the European corn borer.

Corn borer moths lay eggs on the underside of corn leaves. The eggs hatch and following several developmental stages (instars) the larvae become large enough to bore into the corn stalks, creating feeding cavities in the stalk that weaken it and result in lodging. (Lodging refers to corn plants breaking off, thereby becoming difficult or impossible to harvest.)

We did two years of research at Miner Institute (Chazy, New York) comparing replicated strips of Bt and non-Bt corn, with dramatic results: We found that 25 percent (first year) and 46 percent (second year) of the non-Bt corn plants were infested by corn borers, whereas in the Bt hybrid – which was genetically identical except for the Bt trait – there wasn’t a single corn borer in any plant we examined. This was true for both years of the study. I also remember how much work it was for our research staff to harvest and then split the stalk of each corn plant examined, looking for corn borer damage. This was necessary since a casual examination of the plant may miss the borer hole.

Milkweed is the only food of monarch butterfly larvae, so the Cornell researchers dusted corn pollen from a corn hybrid with the Bt-corn borer trait onto milkweed plants, looking for signs of toxicity in the larvae. Their reasoning was that milkweed is commonly found in waste areas near corn fields and the pollen from the corn plants with the Bt trait could drift from the field onto the milkweed leaves. They found that compared with larvae feeding on milkweed leaves dusted with corn pollen from non-Bt corn, the monarch larvae feeding on milkweed dusted with Bt corn pollen had decreased feeding, growth and survival rates. Their conclusion: Bt corn could threaten monarch butterfly populations feeding on milkweed growing near these corn fields.

Other entomologists challenged the validity of these results, pointing out that the amount of pollen dusted onto milkweed leaves was far in excess of anything that would be found in nature. Under natural conditions much of the corn pollen landing on milkweed leaves is blown off by the wind or washed off by rain, but under laboratory conditions there was neither wind nor rain.

What this study did, however, was cause entomologists from other land grant universities to conduct studies of the interaction between Bt corn and monarch butterfly larvae. Extensive research has found that the survival of monarch butterfly populations is not threatened by the planting of corn with the genetically engineered Bt trait. A study in Maryland better represented natural conditions; it examined the survival rate of monarch larvae exposed to Bt and non-Bt corn pollen in a corn field. Survival rates of the larvae ranged from 80 percent to 93 percent, with no difference between the Bt and non-Bt plots.

Some people were concerned that milkweed growing as “weedy invaders” in corn fields would be especially subject to corn pollen deposition. However, farmers work hard to eliminate milkweed from corn fields, usually with good results. One change that has had a negative impact on monarch butterfly populations is the conversion of vast acreages of continuous hayland and permanent pasture in the Midwest – prime sites for milkweed – into rotated cropland growing corn, soybeans and other grain crops. With these cropping changes a greater portion of milkweed is now found along roadsides, resulting in a lot more monarch butterflies meeting their fate on the grilles of passing cars than from exposure to Bt corn pollen.

Recently there’s been good news regarding monarch butterfly populations, which have been greatly increasing from their lows of several years ago. In fact, between 2015 and 2016 the number of monarch butterflies overwintering in Mexico (their natural winter habitat) tripled. These butterflies are also getting a helping hand from the U.S. Department of Agriculture through its Natural Resources Conservation Service (NRCS).

In November 2015 NRCS announced a conservation effort in 10 states in the Midwest and the southern Great Plains aimed at helping farmers provide food and habitat for monarch butterflies. NRCS is providing technical and financial assistance to help farmers and “conservation partners” plant milkweed along field borders, in buffers along waterways or around wetlands, in pastures and other suitable locations where they won’t interfere with normal farming practices. NRCS is also helping farmers manage their pastures to increase populations of milkweed while not decreasing the productive capability of the pasture. Therefore, the situation has changed from where farmers growing Bt corn were wrongly implicated in the decline of the monarch butterfly, to the current program where farmers are assisting USDA efforts to expand populations.

How Are We Doing at Feeding the World?


How are we doing at feeding the world? Not so bad, actually: It’s estimated that worldwide production of foodstuffs must increase by an average of 1.75 percent per year between now and about 2050, when the global population is expected to peak at just under 10 billion. In recent years we’ve been averaging 1.73 percent annual growth, with developed nations outpacing less prosperous ones. The Economist issued a report a couple of years ago on the global food situation and how likely it is that we’ll be able to meet the goal of feeding the world in the decades ahead. The report was refreshingly positive, citing better management and improved technologies (including precision farming and genetically modified crops) as reasons for optimism.
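A quick sketch of what those two growth rates compound to by 2050 (the 2017 start year, and therefore the 33-year horizon, is my assumption):

```python
# Compare the needed vs. recent food-production growth rates, compounded to 2050.
years = 2050 - 2017          # assumed horizon from roughly when this was written
needed = 1.0175 ** years     # 1.75% per year, the estimated requirement
actual = 1.0173 ** years     # 1.73% per year, the recent average

print(f"Needed growth factor by 2050: {needed:.2f}x")
print(f"At 1.73%/yr:                  {actual:.2f}x "
      f"(shortfall of about {(needed - actual) / needed:.1%})")
```

Compounded over three-plus decades, the two rates land within about half a percent of each other – which is why a 0.02-point gap in annual growth still reads as “not so bad.”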

Connections and connectivity

Like The Economist, I’m optimistic about the chances of agriculture feeding the world. One reason is the increased rate of technology transfer, thanks to the internet. A high percentage of the developed world is now “connected,” and agriculture is in the forefront. In the U.S. we’ve long been accustomed to an effective connection (electronic and otherwise) between research done at land grant universities and state Extension services. Research results are forwarded to the appropriate Extension educators at the university and county levels, and they transfer this information to farmers and agribusiness professionals by various means – farm visits, meetings, newsletters (print and electronic), etc. However, even in many other nations we’d consider “developed,” this connectivity is largely missing.

I well remember years ago on a consulting trip to a western European nation learning that a professor at the nation’s top (public) agricultural college was willing to tell farmers what he knew about a particular technology – but at a price. I’ve visited almost a dozen nations that have significant agricultural production, and with the notable exception of Canada (via Ministry of Agriculture offices in each province), there isn’t anything like the researcher-farmer connection we have in the U.S.

The internet is changing that, particularly in developed countries with good internet access. Over half of the European population has an internet connection. More than 90 percent of Japanese and 85 percent of Australians have internet connections. The most connected nation, surprisingly, is Iceland, with 98 percent of its population having internet access. China is at 50 percent and increasing. However, most African nations have very poor internet access, with less than half the population connected – and in some nations less than 25 percent.

There’s no information as to whether farmers in the various nations are more or less connected than the general populace, but I’m betting it’s more. A farmer in Spain can now learn the latest on alfalfa harvest management by entering that phrase into his laptop or smartphone.

Agricultural R&D

One area of great concern is public funding for agricultural research and development (R&D), which has stagnated for the past 40-plus years and in the past 10 to 15 years has actually declined using constant dollars (thus removing any impact from inflation). In contrast, the amount of money invested by the private sector has increased tremendously since 1970. In 1970, the money invested in R&D was about equal from public and private sources. But since then, public funding in R&D has declined while private funding has more than tripled. This change has been most notable since 2000. This benefits some nongovernmental organizations – the William H. Miner Agricultural Research Institute for instance, which has seen its research income increase by multiples over the past 20 years – but there are drawbacks.

If a private company funds research, either at a land grant university or at Miner Institute, it’s almost always with a profit motive in mind. There’s nothing wrong with profit, but there are critical areas of research that at least in the short run don’t appear to benefit any particular agribusiness. An example is research involved in reducing the nutrients discharged by subsurface (tile) drainage systems, a project currently under way at Miner Institute via a U.S. Department of Agriculture grant. This type of research isn’t as exciting to the general public as, for instance, new technologies in seed treatments that increase yield potential, but in the long run it has more potential impact on agricultural production and in protecting our environment.

The nation’s seed companies aren’t taking the “dollar drain” on public agricultural research funding sitting down. In 2017, for the first time there’s a checkoff of $1 per bag of alfalfa seed, with 100 percent of the proceeds going to support crop research at public universities (no administrative costs will be paid by checkoff dollars). Most seed companies are participating in this voluntary program.

Agricultural education and expertise

Land grant universities and other publicly funded agricultural colleges have also felt the pinch of reduced state and federal funding. This has affected both undergraduate teaching and the expertise available at these colleges to farmers and agribusiness. Over a period of about 10 years, one state university with which I’m familiar lost its Extension entomologist, plant pathologist, weed scientist and soil fertility specialist – partly, but not entirely, due to retirements. The last time I checked, none of these positions had been filled. Therefore, when the university conducts refresher training courses for farmers wanting to maintain their Pesticide Applicator licenses, there isn’t anyone left at that university to provide the training – it has to “import” expertise from other institutions. In some cases this is resulting in public-private partnerships, but these only make up for a small part of the long-term dollar drain. There’s never been a better time for farmers to become active in influencing public policy decisions, locally and otherwise.


Guide to Reduced-Lignin Alfalfa

image-of-alfalfa

Farmers have greatly increased the quality of alfalfa fed to their livestock, but primarily through management changes.

In the almost 20 years I’ve been writing for FARMING magazine I’ve never covered the same topic three times in a year – let alone three times in six months. That is, until now: Reduced lignin is generating more excitement in the alfalfa seed business than any genetic advance I can remember. It may be a breakthrough in efforts to improve the quality of “the queen of forage crops.” Farmers have greatly increased the quality of alfalfa fed to their livestock, but primarily through management changes: earlier harvest, improved windrow management to dry the crop more quickly and retain more leaves, and silage inoculants to better preserve forage quality.

There’s still precious little university data on the relative performance of reduced-lignin alfalfa. There are three types of reduced-lignin alfalfas on the market. HarvXtra is a genetically modified type that seems to be attracting the most excitement since it’s up to 20 percent lower in lignin than conventional varieties. All HarvXtra alfalfa varieties (sold by a number of seed companies) are also Roundup Ready, and the combination of this trait plus the reduced-lignin one adds up to a cost of $300 per bag. That’s just for the traits, not the seed itself. The total price (which varies among seed companies and is before any discounts) is about $600 per bag, or two to four times that of conventional alfalfa varieties. There are also two seed companies, Alforex and Legacy, selling conventionally bred alfalfa varieties claiming higher digestibility – all without the Roundup Ready trait. According to preliminary data they appear to be higher in lignin than the HarvXtra varieties but lower than conventional alfalfa. It will take time to sort all this out.

There have been previous efforts to genetically improve alfalfa quality, but reduced-lignin varieties appear to be the first to accomplish it without sacrificing yield or standability. Standability is a potentially big issue in any crop with reduced lignin levels. I remember one alfalfa variety years ago that was promoted as having superior forage quality. We planted it at Miner Institute (Chazy, New York) and sure enough, what we harvested was higher than normal in quality – but that was because every cutting lodged so badly that we left the bottom half of the plant in the field! (By the way, one way to moderately increase the forage quality of an alfalfa field that is in bloom by the time you get to it is to raise the cutterbar a few inches – perhaps harvesting at 6-inch stubble height instead of the normal 2 to 4 inches. You’ll lose some yield, but will probably find surprisingly little of that leftover stubble in the next crop, assuming it’s mowed at the normal height.)

The second reason for this “reduced lignin reprise” is that there’s considerable difference of opinion as to how best to manage reduced-lignin alfalfa on dairy farms. The two commonly recommended harvest strategies are to harvest the crop at the normal stage of maturity – typically late-bud stage – or to harvest it 7 to 10 days later when the crop is in the one-quarter bloom stage. In one case farmers would wind up with perhaps the highest digestibility alfalfa they’ve ever fed; in the other case they’d have higher yields of good (but not “supreme”) quality alfalfa. Let’s look at where these alternative harvest strategies best fit.

Cut at normal harvest date

Reduced-lignin alfalfa doesn’t decline in digestibility any more slowly than conventional alfalfa does; it’s lower in lignin (and therefore higher in digestibility) at all comparable stages of maturity. Therefore, reduced-lignin alfalfa cut at the bud stage is much lower in neutral detergent fiber (NDF) and higher in digestibility than late-bud stage conventional alfalfa varieties cut on the same day.

This may sound like a great idea and it may be depending on how you feed this “rocket fuel.” It’s entirely possible that this alfalfa could be higher in crude protein than in NDF, something few dairy farmers are used to. Alfalfa with this analysis may feed more like a concentrate than forage. If this alfalfa is fed at a relatively low rate and properly balanced with other forages it may give a quality boost to a dairy ration. It’s certainly a feedstuff that needs to be fed with care and with counsel of a competent dairy nutrition consultant.

Delay harvest

Delay harvest 7 to 10 days and cut at 25 percent bloom. This should result in alfalfa of quality similar to varieties with normal lignin levels, though research is needed to determine exactly how many days harvest can be delayed with each type. I favor this alternative for most farmers since forage quality would be about what they’re accustomed to.

Delaying harvest by a week or more for each cutting would usually result in one fewer harvest per season. When is less more? When one fewer harvest results in higher total yield! Several years of research at the University of Wisconsin with a conventional alfalfa variety confirmed this – higher yields with three vs. four harvests per season. Prior to the advent of reduced-lignin alfalfa, delaying harvest by a week or more would result in poor forage quality. Fewer harvests mean less labor and equipment cost and less wheel track damage to alfalfa plants. Delaying harvest by a week also allows the alfalfa plant to store more carbohydrates in its root system. Repeated harvests at the bud stage take their toll on the health of the alfalfa plant. When alfalfa is harvested, photosynthesis stops because there are no leaves and thus nothing to transmit nutrients to the taproot. Mowing alfalfa also causes the death of rhizobial nodules and some root hairs. According to Cornell University Forage Agronomist Jerry Cherney, repeated harvesting at the bud stage is an “accumulation of insults” that, when combined with aggressive fall harvest management, can severely deplete stands.

Where harvest of reduced-lignin alfalfa at the bud stage might work well is when a forage grass is seeded with the alfalfa. In this case the low fiber level of the alfalfa would be balanced by the higher fiber level of the grass, which often matures a bit sooner than we want it to. Each farm planting reduced-lignin alfalfa will have to decide which approach to take: “super quality” alfalfa with no change in harvest dates, or delayed harvest, which may result in higher yields and one fewer cut. No one size fits all, but now you know where I stand on this.


What Farmers Need to Know About Crop Production

getting-back-on-land

For farmers, the first signs of spring mean ultimately getting back on their fields to spread manure, fertilize grasses and plant crops. Following are some suggestions as farmers begin another year of crop production.

For farmers, the first signs of spring mean ultimately getting back on their fields to spread manure, fertilize grasses and plant crops. The time to do these chores varies depending on where you farm, but the order in which they should be done won’t vary much. Following are some suggestions as farmers begin another year of crop production.

Fertilizing perennial forages

This isn’t spring “planting,” but for any farmer growing perennial grasses – and this includes mostly grass pastures as well as fields harvested for hay or silage – early spring is critical. That’s because some form of nitrogen, either fertilizer N or animal manure, should be applied to these fields just as the grass begins spring growth or as soon after that as possible. A wet spring may delay application, but every effort should be made to apply as soon as field conditions permit, even if the grass is already 6 inches high. That’s because nutrients have a hard time getting past the dense, fibrous root system of established grass, so any nitrogen not used by the first crop should be available for the second one.

By spring many manure storages are full and farmers are looking for places to spread. However, early spring grass fields are often soft, especially if there was a heavy cover of snow, and running over these fields with a tractor and manure spreader – even a tanker with wide, low-pressure tires – can cut up fields. However, this depends to some extent on the species of grass in the field. Grass species with fine root systems including timothy, orchardgrass and smooth bromegrass won’t support spring field traffic nearly as well as a species with a rugged root system such as reed canarygrass.

At Miner Institute in Chazy, New York, we were often able to spread urea in April on a long-established field of reed canarygrass with field conditions that were so wet that water was running off the tractor tires. But we never made a rut as the dense canarygrass root system provided all the support the equipment needed. Reed canarygrass has fallen out of favor with many dairy farmers because it has lower forage quality than most other grass species, but tall fescue (another cool-season grass with high yield potential) has a root system that more closely resembles canarygrass than it does other forage grasses. Will this be enough to make a difference in spring manure application? Not sure, but if you have an established field of tall fescue it may be worth a (cautious) attempt.

The “when” of spring N application to grass depends on the particular situation, but not the “if.” In fact, the economics of grass production rests to a great extent on that first crop, which represents a greater proportion of total annual yield than with alfalfa or alfalfa grass. We did research at Miner Institute on grass fertilization, applying 100 pounds of fertilizer N just as the grass started growing in the spring. Compared with the unfertilized plots, N fertilization more than doubled first-cut yields and increased the crude protein content of the grass from less than 12 percent to over 18 percent. This all happened in the approximately five weeks from fertilization to harvest. What else can you do on a farm that will double yields and greatly increase forage quality in only five weeks?

Spring seedings

On most farms forages should be seeded before corn planting begins. This is especially important if a companion crop such as a spring cereal (usually oats) or a cereal-pea mixture is seeded with the forage. That’s because as both crops begin to grow, the cereal crop can aggressively compete with the smaller-seeded legumes and grasses for soil moisture, and soil moisture conditions are usually better in the early spring.

Seeding equipment: Where soil phosphorus levels are good, either a broadcast drill (such as a Brillion Seeder) or a grain drill with band seeding capability can be used with good results. However, if soil-test P levels are low, my strong preference is for a grain drill, since with a drill the fertilizer can be applied in a band under the seed row. Research has shown this method of phosphorus fertilization to be several times more efficient than broadcast P application. Many farmers wouldn’t dream of planting corn without any fertilizer in the hoppers, and the same principle holds when seeding forage crops: Supply phosphorus-containing fertilizer at such a time and place that the germinating seedling has ready access to it. Alfalfa and grass roots head straight down after the seed germinates, which is why the fertilizer is placed directly under the seed row.

This used to be a much more common topic a generation ago, when soil test P levels on dairy and other livestock farms were generally lower. However, as dairy farming has intensified – more cows, and therefore more manure, per unit of land – soil phosphorus status has increased, so there are fewer fields with low soil test P levels.

One reason Brillion Seeders have been so popular with farmers despite their lack of fertilizer hoppers is that they don’t bury the seed too deep and do an excellent job of firming the seedbed during seeding. This is also possible with new and old grain drills, especially if they’re equipped with press wheels that trail behind the seed tubes. If neither a press wheel drill nor a Brillion Seeder is used, then the soil should be firmed right after seeding with a roller or cultipacker (I prefer a cultipacker). “Right after seeding” means within a day or two, not a week later. Firming the soil following seeding is important, but in some cases the seedbed may have been tilled so much that it’s fluffy and should be firmed before as well as during or after seeding. The goal is to have a seedbed firm enough prior to seeding that a boot imprint is no more than the thickness of the sole. (A possible exception to this rule of thumb would be a fat farmer with small feet!)

Fertilization more than doubled first-cut yields and increased the crude protein content of the grass from less than 12 percent to over 18 percent.

Corn planting

Corn planting typically starts after soil temperatures are at least 50 degrees Fahrenheit at a depth of 4 inches, but farmers with a lot of corn to plant sometimes push the envelope and start a week or so earlier. The climate has warmed in the past 30 years or so, so you should actually measure soil temperature instead of planting on the same calendar date your father (or grandfather) did. How aggressive a farmer needs to be about starting to plant corn depends on how much corn he or she has to plant. It’s not nearly as important when you start planting corn as when you finish. That’s because the first corn planted often isn’t the highest yielding, but as planting extends past the ideal planting window – the last week of May in much of the northeastern U.S. – yield potential begins to decline.

About as important as soil temperature is soil moisture status. Farmers will have much more success planting into soil that’s borderline low for temperature but dry enough to plant versus planting into a cold, wet soil. It’s better to wait until the field is ready to plant, even if it’s a week later, than to “mud in” a field. Cold injury to germinating corn is much more of a problem in wet soils.

Another factor that permits planting corn slightly earlier than farmers did a generation ago is much more effective seed treatments. In the old days seed corn treatment was done by the farmer as a planter box application. Adherence of the seed treatment – typically containing insecticide and fungicide – was often somewhat less than ideal, in part due to the occasionally lousy job farmers did in mixing the treatment with the seed corn. By the time the corn kernel was placed in the seed furrow, much of the treatment wasn’t on the seed and it was susceptible to attack by seed corn maggots and the fungi that cause seed rots.

Most seed corn (unless it’s for organic production) comes pretreated by the seed company, which is much more effective and one less thing for farmers to do. Precision corn planting equipment has also resulted in more uniform seed placement. All of these contribute to better results from early planting. We used to recommend that farmers overplant seed corn by about 15 percent, but we’ve come so far in germination percentage, pest control and planting equipment that some seed companies recommend overplanting by only 5 percent, increasing the overplanting rate to 10 percent under challenging conditions such as very early planting or cloddy soils.
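The overplanting arithmetic above is simple enough to sketch in a few lines. This is a hypothetical helper, not any seed company's tool; the 5 and 10 percent figures are the ones cited above, and the 32,000 plants-per-acre target is only an illustrative number:

```python
def seeding_rate(target_population, overplant_fraction):
    """Seeds to drop per acre to hit a target final stand.

    target_population: desired plants per acre at harvest
    overplant_fraction: extra seed to compensate for germination
        and stand losses -- e.g., 0.05 under good conditions,
        0.10 for very early planting or cloddy soils
    """
    return target_population * (1 + overplant_fraction)

# Illustrative target of 32,000 plants per acre:
print(seeding_rate(32_000, 0.05))  # 33600.0 seeds/acre, good conditions
print(seeding_rate(32_000, 0.10))  # 35200.0 seeds/acre, challenging conditions
```

The same function works for the older 15 percent rule of thumb by passing 0.15.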

In summary

Spread nitrogen fertilizer or manure on established grass fields as soon as the grass starts growing. If you can’t do it then, later is better than not at all.

Seed forage crops into a fine, firm seedbed, paying attention to the soil test phosphorus status of fields to be seeded. Band fertilizer application is more important if soil test P is low.

Plant corn as soon as temperatures at 4 inches soil depth reach 50 degrees Fahrenheit. How much risk you take in planting earlier than this depends on soil conditions and how long you expect it to take to complete corn planting. When you finish has more impact on yield than when you start.

Western Bean Cutworm: Genetic Traits

genetic-traits-image

One of the big field crop stories this past fall was the reported inability of a genetically modified trait, Cry1F, to adequately control western bean cutworm – not in beans, but in field corn. This trait became available about 15 years ago, at which time western bean cutworm was primarily found in the western Corn Belt states of Colorado, Idaho, Nebraska, and Wyoming. Of these, only Nebraska and Idaho are major corn-growing states.

At the time, seed company marketing literature barely mentioned the western bean cutworm, but this changed after Iowa State University entomologists discovered economic damage from this insect in that state. Formerly a pest of the Western Plains, the western bean cutworm has now rapidly spread eastward into the northeastern U.S., including all of New York, most of Pennsylvania and parts of New England. It’s been confirmed in damaging numbers in northern New York, so cold winters apparently aren’t a deterrent!

In 2016 corn-growing farmers generally assumed that the Cry1F trait adequately controlled the western bean cutworm, but damage was severe in many fields. By the time people realized that the trait wasn’t effective on this pest it was too late to apply insecticide as a rescue treatment. The western bean cutworm is now the primary insect pest affecting corn ears, and in a still-expanding area. The situation is so serious that entomologists from five land-grant universities (including Cornell University and Penn State) have written an open letter urging the seed industry to remove the “control” designation from the Cry1F trait regarding western bean cutworm.

Be a good scout – or hire one

Unlike the situation with the western corn rootworm, the failure of the Cry1F trait to adequately control western bean cutworm has little to do with farmers’ crop rotations or a failure to follow recommended pest control measures. That’s why as the insect moves into new areas it can wreak so much havoc in corn fields.

It’s important for someone – farmer, crops consultant or trusted employee – to scout corn fields so that action can be taken if necessary. Cutworm larvae overwinter in the soil, emerging in early summer and becoming moths that mate and lay eggs on corn leaves. The larvae feed on the leaves and pollen, later moving into the ear.

The best way to scout for western bean cutworms is to check for egg masses on the upper flag leaf, checking at least 20 plants each in several parts of a field. Field scouting is key because as is true with many insect pests, cutworms are much more effectively controlled if the problem is detected early. One source recommends an insecticide application if at least 5 percent of plants have egg masses on the upper leaves or in the whorl. Many insecticides are labeled for western bean cutworm, but the registration status of the insecticides may vary by state so check with local authorities and as always – read the label.
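The scouting decision described above boils down to a percentage and a threshold. Here's a minimal sketch of that logic; the function name is my own, and the 5 percent threshold is the one cited above (thresholds vary by source and state, so confirm locally):

```python
def should_spray(plants_with_egg_masses, plants_checked, threshold_pct=5.0):
    """Apply the egg-mass scouting threshold for western bean cutworm.

    plants_with_egg_masses: scouted plants with egg masses on the
        upper leaves or in the whorl
    plants_checked: total plants scouted (at least 20 per area,
        in several parts of the field)
    threshold_pct: action threshold cited in the article (5 percent)
    """
    infested_pct = 100.0 * plants_with_egg_masses / plants_checked
    return infested_pct >= threshold_pct

# 6 infested plants out of 100 scouted is 6 percent -- above threshold
print(should_spray(6, 100))  # True
print(should_spray(3, 100))  # False
```

Checking several 20-plant stations and pooling the counts gives a more reliable percentage than any single station alone.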

This is a rapidly evolving situation, and recommendations may change by this summer. Just because a farm is outside the current “known area” for the cutworm doesn’t mean you won’t have problems. It’s likely that by this summer the insect will have spread throughout most corn-growing areas in New England. Discuss the various genetically modified trait options with your Extension educator and/or seed dealer. Field evaluations by Cooperative Extension in Northern New York last September found a wide range in western bean cutworm damage depending on which genetic trait was used.

Prevention vs. control

Management of the western bean cutworm will likely be similar to that of other cutworms in that crop rotation will have little impact on infestation level. A first-year corn field may be about as likely to have an infestation as a fifth-year one, especially if there are other corn fields in the area. That’s because cutworm moths can fly from field to field in depositing eggs.

The most effective means of control appears to be use of genetic traits proven to be more effective than Cry1F in controlling this insect. Unlike the situation with the western corn rootworm, the control problem with western bean cutworm isn’t that the insect developed resistance to a particular genetically modified trait. It took years for rootworms to develop resistance. The process was undoubtedly sped up by some farmers planting continuous corn while using the same GM traits year after year, and in some cases by ignoring the required planted refuge area.

I’ve noted before that to some extent rootworm resistance is a case of (to quote Walt Kelly’s Pogo) “We have met the enemy, and he is us.” Although entomologists think that resistance would have occurred regardless of farmer action (or inaction), this isn’t any excuse for ignoring the requirements. (These aren’t suggestions or recommendations but requirements that farmers must agree to in writing.)

My recommendation: If you see an article, online or in a farm magazine, about the western bean cutworm, be sure to read it since it may contain new information and/or suggestions for control. There may be specific recommendations of which GM traits are most effective in combating this pest, and which should either be avoided or used only as part of a “stack” with other GM traits.

Unfortunately, organic farmers and others who choose to use neither insecticides nor genetically modified corn have no obvious means of prevention or control given the life cycle of this insect. Furthermore, the western bean cutworm affects both field corn and sweet corn, and it may become a serious pest for sweet corn growers. Even a modest infestation can be a turnoff for sweet corn lovers, who won’t be thrilled by husking an ear only to discover that a cutworm has set up housekeeping there.


What Dairy Farmers Should Be Doing Now for Forage

dairy-farmer-image-of-forage

Crop inventory management is always recommended but is especially important where forage supplies are limited.

If you harvested corn for silage this past fall, this article provides some guidelines to consider. It’s risky to generalize when addressing crop quality or quantity, given the wide coverage provided by FARMING magazine. The situation may be even more varied this winter because some parts of the region had their worst drought in at least 20 years while crops were good to excellent in other areas that got several timely midsummer rains.

On the plus side, an alfalfa trial in central New York was cut five times with some plots yielding about 9 tons of dry matter per acre, something that’s all but unheard of in the eastern U.S. On the minus side, some farmers in northern New York reported no measurable rainfall for two months this past summer. All that said, here are some nuggets to consider:

Manage crop inventories

Crop inventory management is always recommended but is especially important where forage supplies are limited. Cows hate sudden changes in forage quality or type, so it’s better to feed a moderate rate of corn silage year-round than to feed a lot for the next six months or so and then run out next summer.

However, there are two caveats. First, especially where corn silage is stored in bunker silos or drive-over piles, feed enough to prevent spoilage. Opinions differ as to whether “splitting a bunk” – feeding only a portion of the face for a period of time – is a good idea. You’ll suffer spoilage losses on the face of the silage left exposed for days or weeks, but in some cases this may be preferable to not feeding enough each day and winding up with warm and/or moldy silage. I generally recommend against bunk splitting, but there may be times when it’s the best option. In most cases proper planning should prevent this scenario.

Second, I’ll note again that brown midrib (BMR) corn silage is different in many ways. For example, a difference is in the feeding rate needed for an economical milk response. Research and farmer experience have concluded that when feeding BMR corn silage the ration needs to include at least 10 to 12 pounds (dry matter basis) of this forage. Therefore, if your BMR corn silage is 33 percent DM you would need to feed at least 30 to 36 pounds of silage. These rates are minimums – many farmers feed somewhat more than this.
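The dry matter-to-as-fed conversion above is worth making explicit, since it trips people up. A minimal sketch, using the 10 to 12 pound DM minimum and 33 percent DM figures from the paragraph (your silage's actual DM will differ, so test it):

```python
def as_fed_rate(dry_matter_lbs, dm_fraction):
    """Convert a dry matter feeding rate to an as-fed rate.

    dry_matter_lbs: pounds of forage dry matter per cow per day
    dm_fraction: silage dry matter content (e.g., 0.33 for 33% DM)
    """
    return dry_matter_lbs / dm_fraction

# The 10-12 lb DM minimum for BMR corn silage at 33% DM:
print(round(as_fed_rate(10, 0.33)))  # 30 lb as fed
print(round(as_fed_rate(12, 0.33)))  # 36 lb as fed
```

Note that wetter silage means more as-fed pounds for the same dry matter: at 30 percent DM the same 10 lb DM minimum works out to about 33 lb as fed.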

I remember years ago hearing of a state university dairy herd that was getting BMR corn silage for the first time. They started feeding the BMR silage at a modest rate – as I recall, 6 to 7 pounds of dry matter per cow per day – and were very disappointed when the cows barely budged in milk production. Then after “asking the experts” they doubled the rate of BMR in the ration and the cows jumped 8 or 9 pounds – and this was a high-producing herd!

With that in mind, what should you do if you have BMR corn silage but not enough to feed year-round at the minimum recommended rate? Assuming that the crop is properly ensiled and not subject to high storage losses, it may be best to wait until warm weather to start feeding BMR. Many farmers do this after concluding from experience that responses to BMR corn silage are better during times of potential heat stress.

Real vs. ideal

Ideally you shouldn’t start feeding “new crop” corn silage until it’s fully fermented, and then some. The “then some” is because changes continue in the crop long after what’s considered normal fermentation is complete. Any changes in fiber digestibility occur quickly, during the heating/cool-down process, but starch digestibility continues to increase for several months after whole-plant corn is ensiled. These increases are significant enough to have an impact on rations – in this case a positive economic impact since higher starch digestibility in your corn silage means less corn you have to purchase as grain. We’ve heard of dairies feeding high rates of corn silage that during the spring pulled a sizable amount of corn meal out of their rations – and the cows went up in production!

That’s the ideal, but the reality this winter, following a severely dry summer in much of the region, is that many farmers didn’t harvest nearly as much 2016 corn silage as they’d intended. But this doesn’t necessarily mean that you should start feeding new crop corn silage any sooner. If you put up enough corn silage that you wouldn’t have to start feeding new crop until January or later, you should just be getting into what for many will be a short crop.

What to do now? Certainly have a “meaningful discussion” with your dairy nutritionist, but also do an inventory of all silages. There are silo capacity tables in print and online for all types of silo storages. The variables are type of crop and (for bunker silos and drive-over piles) silage density since this is influenced by how well the crop was packed as it was ensiled. If necessary, ask your Extension educator or dairy nutritionist for assistance in approximating (or ideally, measuring) silage density. Drive-over scales make this a lot simpler – if you used them during harvest. If you have several silo storages, make a simple diagram with what forage is stored in each along with the approximate amount. With this information you and your dairy nutritionist can estimate how long the silage in each silo should last, given the current rates of feeding. Knowing this now gives you more time to make the necessary adjustments in ration formulation or perhaps even purchasing supplemental forage. I don’t like the idea of moving silage, but if you have to move it from seller to buyer, it’s better to do so during the winter when spoilage losses should be less.
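The inventory arithmetic described above can be sketched in a few lines. This is an illustrative calculation only: the 44 lb/cu ft density, bunker dimensions, herd size and feeding rate below are all assumed numbers, not recommendations, and a real estimate should use a silo capacity table or a measured density for your own storage:

```python
def bunker_inventory_tons(length_ft, width_ft, avg_depth_ft,
                          density_lbs_per_cuft=44.0):
    """Rough as-fed tons in a bunker silo or drive-over pile.

    density_lbs_per_cuft: packed silage density. 44 lb/cu ft is an
    illustrative figure only -- density depends on how well the crop
    was packed, so measure it or ask your nutritionist to estimate it.
    """
    cubic_feet = length_ft * width_ft * avg_depth_ft
    return cubic_feet * density_lbs_per_cuft / 2000.0

def days_of_feed(inventory_tons, cows, lbs_as_fed_per_cow_per_day):
    """How long the silage should last at the current feeding rate."""
    daily_tons = cows * lbs_as_fed_per_cow_per_day / 2000.0
    return inventory_tons / daily_tons

# Hypothetical 120 x 40 ft bunker averaging 10 ft deep,
# feeding 200 cows at 90 lb of silage (as fed) per cow per day:
tons = bunker_inventory_tons(120, 40, 10)
print(round(tons))                        # 1056 tons
print(round(days_of_feed(tons, 200, 90)))  # 117 days
```

Running this for each storage, then laying the results against your planned feed-out dates, is exactly the kind of simple diagram-plus-numbers exercise described above.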

A Look Back at a Hot, (Mostly) Dry Summer

bale of hay

In the April 2016 issue, we noted that some weather models predicted hotter-than-normal conditions in the Northeastern U.S. This came to pass, but what the models didn’t predict was the moderate to severe drought conditions that affected much of the region.

Precipitation varied widely from almost normal in northeastern New York to bone dry in other areas. One farmer in northern New York reported that he didn’t have any measurable rainfall between June 5 and Aug. 13, and New York agricultural officials reported it was the most serious drought in that state in at least 20 years.

Rainfall (actually soil moisture) and air temperature affect crop yield and quality. Hot weather decreases crop quality and increases the rate of maturity. But what’s really bad for crops is the combination of heat and excess soil moisture.

Despite many days when the temperature exceeds 100 F, Arizona farmers grow excellent quality alfalfa with sufficient water, usually provided by irrigation. High temperatures in the absence of excess water can result in good yields and high quality. Summers in the southeastern U.S. are also hot, but there’s usually an abundance of humidity and rain. This is terrible for anyone trying to grow alfalfa, which is why this species isn’t nearly as common in the Southeast. High humidity increases disease incidence above the soil surface, while saturated soils also take their toll.

Lignin, part of a plant’s cell walls, takes its name from the Latin word for “wood.” It makes up about one-third of the dry weight of wood and is responsible for strengthening the wood in trees. Lignin is also present in crops; without it, alfalfa and corn plants would be lying flat on the ground. Although a little lignin is essential, too much is bad because it’s indigestible by dairy cattle and other ruminants.

The combination of heat and wet soils affects forage quality, but in the case of alfalfa and other forage legumes such as clover and birdsfoot trefoil, the main impact is on the stems, which contain much of the lignin found in the plant. The leaves have no structural function, so they contain very little lignin. Therefore, alfalfa leaves aren’t much affected by the weather conditions that can wreak havoc on the stems. Hot weather increases the leaf-to-stem ratio of alfalfa, so while yields may be lower, the forage quality of the entire plant may be similar to that of alfalfa grown under more favorable conditions. As temperatures rise, the difference between the low- and high-lignin parts of the plant increases. Therefore, whether it’s a hot summer or a general warming of the climate, it’s important to manage alfalfa to retain as many leaves as possible. Just as the low-fiber grain in corn harvested for silage is Mother Nature’s “crop insurance,” so are the leaves on an alfalfa plant.

Managing alfalfa for high leaf retention

Farmers can take two steps to retain a high proportion of alfalfa leaves from mowing through baling or ensiling. The first is the choice of mower-conditioner: rollers or impellers (also called flails). Rollers create a crushing action, while impellers have a stripping action. Which works best depends on the species of hay crop. Research in the U.S. has found that when alfalfa is conditioned with an impeller conditioner, there’s a small increase in leaf loss compared with rollers.

Italian researchers found that alfalfa conditioned with an impeller resulted in alfalfa hay that was about 1 percent lower in crude protein than the same crop conditioned with rollers. The University of Wisconsin reported similar results: Impellers result in 2 percent to 3 percent higher field losses with alfalfa, and all of these losses are leaves, so quality is reduced significantly. Roller conditioners are also reported to result in a faster drying rate of alfalfa.

I’m not opposed to impeller conditioners; in fact, I think they do a superior job on forage grasses. That’s why they’re so popular in Europe, where there’s a lot of intensively managed grass but not a lot of alfalfa. While the drying rate of alfalfa is slower with impeller conditioners, the drying rate of grasses is faster. Each type of conditioner has its place.

The second step in alfalfa leaf retention is windrow management, specifically the width of the mowed windrow. Wide windrows dry the alfalfa more evenly than the same crop managed in narrow windrows. There’s been plenty of research on this, but we still have a long way to go, since many farmers are still managing alfalfa in 3- or 4-foot-wide windrows. Consider what must happen in these narrow windrows for the entire windrow to average 35 percent to 45 percent dry matter, the normal range we usually see on farms.

The alfalfa in the middle of a narrow windrow dries very slowly – perhaps not at all for many hours after mowing. Meanwhile, the alfalfa on the top continues to dry until the leaves are bone-dry. When the alfalfa is chopped, these leaves shatter and either fall directly to the ground or are pulverized by the chopper knives and blow back onto the field instead of into the self-unloading wagon or forage truck. This can result in a loss of several points of crude protein between stem and silo.

With careful crop management, we can make the most of the weather that comes down the pike.