The concept of precision agriculture dates back to the late 1980s, with the aim of implementing dynamic farm management practices based on in-field observations and real-time measurements of climate, field, and crop conditions. Precision agriculture allows farmers to use land, water, fuel, fertilizer, and pesticides more effectively. Its benefits, and the need for it, are apparent: it allows farmers to manage their fields more precisely, with the expectation of reducing resource inputs. Moreover, precision agriculture has significant environmental implications, especially for soil and water sustainability.
In its early phase of evolution, precision agriculture technologies focused primarily on improving nutritional inputs, adapting to weather and climate conditions, and identifying early signs of crop health issues. These technologies targeted a finer granularity than the field level, such as the plot level. Over the past decades, the application of precision agriculture has continued to grow, enabled by technological advances, especially GPS. During this first wave of development, precision agriculture technology relied mainly on localized computing power and applied computational methodologies to existing solutions. In most cases, research was conducted at research agencies and specialized enterprises because of the relatively high development costs.
“Even with the assistance of technology, farmers still need to make holistic decisions by analyzing the information received from multiple technology providers”
The situation changed with the rise of cloud computing roughly six to seven years ago. The adoption of cloud computing significantly accelerated technology development in the agriculture sector, which benefited chiefly from the low cost and the flexibility to scale up and down. The industry has since seen a bloom of technologies on the market offered to farmers and other players in the agriculture and food value chain.