
Organizational and Technological Aspects of a Platform for Collective Food Awareness.

1. Introduction

Modern food consumers are increasingly engaged in open discussions, comments, and feedback on the characteristics, quality, and safety of food, which has become a highly popular topic (consider, for instance, the many food pictures and messages posted daily on online social media). Moreover, food consumers communicate and interact with one another, with food suppliers, and with third parties in loose, open, effective, and flexible ways, in a continuous search for food information transparency and greater visibility of food supply chains.

On the other hand, new technological advances, especially in food sensor miniaturization, have made possible the development of lab-on-smartphone platforms for mobile food diagnostics that allow rapid, on-site food analysis for preliminary yet meaningful food information extraction. These platforms use hand-held, low-cost devices (e.g., food scanners or food sniffers) to capture food data (e.g., measurements of physical, chemical, biological, and microbiological food properties) or data on food-related entities (e.g., labels, packages, containers, and the environment) and communicate them to specialized smartphone/tablet apps. These devices are easy to use and offer analytical precision and resolution almost equivalent to bench-top instruments.

These trends let us envisage future scenarios where consumers and other stakeholders of the food supply chain, combining their own capabilities with ICT and food diagnostics technologies, could collaboratively constitute a large-scale socio-technical superorganism capable of fostering collective food awareness. Here, we refer to collective food awareness (CFA, for short) as the food beliefs, knowledge, and information, shared within a consumer community, that drive the food consumption patterns of community members in terms of culinary preferences, food habits, and needs.

The need to share food information and knowledge stems from the fact that quality and safety issues about food are difficult to identify and, in most cases, recognizable only after consumption. Indeed, depending on the type of attribute, food is an experience good (some food attributes can be determined only after purchase and consumption) or a credence good (some food attributes cannot be determined by the consumer even after consumption). In food markets, this intrinsic nature of food facilitates information asymmetries that deeply affect consumers' decisions and behaviour. The main consequences of asymmetric information are moral hazard (a food producer takes more risks, e.g., false labelling or food adulteration, because consumers bear the burden of those risks) and adverse selection (producers hide some food information in a transaction, leading consumers to poor decision making).

A broad CFA helps many of the "problems" linked to information asymmetries vanish and, beyond that, could lead consumers to greater consciousness about health and to environmental choices compatible with social goals. It can be fostered by a sociotechnical infrastructure based on a platform that empowers consumers by collectively managing (generating, verifying/validating, and distributing) information on the safety and quality of food products and processes, as well as on environmental, social, and ethical issues.

In line with other works on collective awareness platforms [1-3], we view a CFA platform as an ICT system for gathering and making use of open food data that combines social media, distributed knowledge creation, and IoF technologies in order to support the creation of CFA within a food consumer community. (IoF, the Internet of Food, is an offshoot of the Internet of Things: a network of smart food things, i.e., food-related objects and devices augmented with sensing, computing, and communication capabilities in order to provide advanced services. Smart food things include sensor-equipped information artifacts (e.g., food labels with RFID or NFC tags), time-temperature indicators and other sensors on packages that detect spoiled food, sensor devices that spot bacterial contamination in food and water, kitchen devices that generate a record of compliance with food safety protocols, wearables that count bites and estimate calories, and so on [4].)

A general research question that is crucial for sociotechnical infrastructures aimed at creating a CFA is the following:

How can a CFA platform empower food consumers to have control over their own food and be responsive to their expectations of reliable food information?

In this paper, we focus on four implied questions flowing from this general question and reflecting different points of view:

(1) How can a consumer community share reliable food information derived from instrumental measurements of food properties performed by consumers?

(2) What is the functional architecture of a CFA platform that supports such a process and lets a consumer community share reliable food information?

(3) What are the entities, with their relevant properties, that characterize the CFA platform interaction context?

(4) Which technologies can allow a CFA platform to generate food information based on scientific instrumental measurements of food properties?

The rest of the paper includes a short background discussion on the superorganism paradigm and four sections devoted to answering these questions.

2. Background

As people increasingly become connected, active participants in smart environments, the convergence of the "Internet of Things" and "Social Networks" worlds is gaining momentum in much research [5], paving the way to a new generation of "user-in-the-loop" context-aware systems [6]. The challenge is to harness the collaborative power of ICT networks (networks of people, of knowledge, and of sensors) to create collective and individual awareness [7].

A single "individual" is characterized by heterogeneity and limited reasoning capabilities, acting in an autonomous way within a smart environment. However, when many individuals join together they can self-organize into large-scale cooperative collectives, based on the assumption that a large number of individuals tied in a social network can provide far more accurate answers to complex problems than a single individual or a small group [8]. According to this perspective, the very large number of interconnected objects or people can be exploited to create what several researches define "superorganism" [9] or "swarm intelligence" [10], since they exhibit properties of a living organism (e.g., "collective intelligence") on their own. In fact, such approach is inspired by self-organizational behaviour of complex systems in nature [11], with particular reference to ant colonies. While a single ant has very limited sensing and actuating capabilities and little or no cognitive abilities, by and large, ants can indirectly coordinate their movements and activities, via spreading and sensing of pheromones in the environment, exhibiting, as a colony, a very powerful collective behaviour [12].

Collective intelligence and nature-inspired computing are extremely interesting phenomena that have been addressed in several application fields, e.g., smart cities [13], manufacturing [14], healthcare [15], energy [16], and finance [17].

The food sector is another promising application area. The increasing demand for safe, high-quality, and healthy food, recent food safety incidents and scandals, and the availability of new smart food technologies have led to substantial changes in the behaviour of both food consumers and food information users [4, 18]. Today's consumers have access to a wealth of mobile app-based services that provide them with food information (food traceability, nutrition advice, recipes, and purchasing support). At the same time, new digital businesses can collect and process large amounts of food data through data analytics and intelligence tools to better understand food consumers and increase the effectiveness of food processes.

Moreover, the coupling of smart food technologies with social networking technologies is opening up a world where consumers can interact, communicate, and collaborate with each other in loose, open, effective, and flexible ways, enhancing the transparency and visibility of food supply chains through collective wisdom and intelligence [19].

Just as individual ants together behave as if they were a single superorganism, we can envisage a near future where food consumers are engaged in large-scale coordinated activities for the good of everyone. In our opinion, it is advisable that some of these activities address the creation of CFA. Although the superorganism paradigm has been employed to build collective awareness in many fields, prior research has not explicitly focused on the organizational and technological aspects of creating CFA within a consumer community.

3. Collectively Generating and Sharing Reliable Food Information

As a first attempt to answer question 1, posed in the Introduction, we introduce a process that allows a consumer community to share reliable information on the food performances of food items belonging to the same food class. In our process model, we assume that the reliability of a food performance is determined by a collective interpretation of food item characteristics derived from instrumental measurements performed by some consumer community members. According to Peri [9], we refer to food characteristics as physical, chemical, biological, and microbiological food properties that are objectively attributable to food and do not change from consumer to consumer (e.g., food shape, weight, size, structure, and composition in terms of chemical or bioactive compounds). We refer to food performances as functional and subjective food properties; i.e., they relate to the consumer and do not exist except in the interaction between food products and consumers. They include sensory, nutritional, safety, and aesthetic properties.

In what follows, we describe the process from a perspective that addresses its structure in terms of components and roles, and we include a process scenario.

3.1. Process Actors and Roles. The main roles, actors, and their interrelationships are the following:

(i) Recipient (R): he/she is a consumer community member who needs reliable information about a food item performance. He/she makes a request r(i, p) to a Food Information Broker, where i refers to some identity property values of a food item (e.g., a product batch number, production date and place, etc.) and p is the identifier of the performance whose value he/she wants to know. In order to provide these data, he/she possibly interacts with a technological CFA platform through his/her own handheld device and Food Information Artifacts (FIAs) located in the surrounding environment (according to [20], a FIA is a physical entity expressly created to bear food information, e.g., labels, tables, RFID chips, and NFC tags).

(ii) Contributor (C): he/she is a consumer community member who contributes to the process by providing a Food Information Broker with some food item data. In particular:

(a) he/she implicitly or explicitly acquires food item data through smart food things, i.e., sensor devices that capture implicit or explicit signals from a food item (e.g., food near-infrared emission, food volatile compounds) or the consumer body (e.g., blood glucose level, chewing sound, and skin temperature);

(b) he/she explicitly acquires other descriptive identity data of a food item (e.g., batch number, production date, and provenance) from a FIA;

(c) he/she uses his/her own handheld device to communicate acquired food item data to the Food Information Broker.

(iii) Food Information Broker (FIB) is an intermediate agent that plays a threefold role. Firstly, it receives a request r(i, p) from R and checks whether it has already been satisfied; otherwise, it submits a new challenge question to a Collective Challenge Solver (CCS). A challenge consists in determining to what extent food items with the same values of i share the same value of p and, possibly, in finding this value. Secondly, it receives challenge answers from the CCS and makes them understandable (human-readable) to R. Thirdly, it receives and checks both data acquired by C and other interaction context data captured by environmental sensors, and passes them to a Food Analysis Manager;

(iv) Food Analysis Manager (FAM) is a food data analyst that is able to perform a food item diagnosis. It receives food item data and other interaction context data from the FIB and applies some intelligent methods to determine food item characteristics. Generally, these methods analyse food item data against food-characteristic-specific knowledge through machine learning techniques and/or statistical analysis (such as principal component analysis or supervised pattern recognition techniques). For instance, classification-based methods match food item data against class models in order to determine the value of a single food item characteristic. Food item diagnostics and identity data are then sent to a Food Journal Manager;

(v) Food Journal Manager (FJM) is a food database manager that collects and organizes data coming from the FAM. It also provides the results of queries q(i, c) formulated by the Collective Challenge Solver. Query results consist of the set of values of the characteristics c for food items having the same identity properties i;

(vi) Collective Challenge Solver (CCS) is an intelligent agent that plays the core role in the collective process for generating reliable food information. It receives from the FIB a challenge question consisting in finding the value of the food performance p that is possibly shared by all food items with the same identity properties i. Leveraging a food knowledge base, it selects the food characteristics c that are factors of the food performance p. It formulates the query q(i, c) to the FJM and, once query results are obtained, it applies collectively reliable criteria in order to possibly determine the value of the food performance shared by food items with the same value of i. A Reliability Authority establishes these criteria, whose application may require the CCS to use specific methods (e.g., statistical methods, machine learning, neural networks) [21, 22];

(vii) Reliability Authority (RA) is an organizational entity that is responsible for process governance. It sets and manages the criteria that the CCS uses to provide reliable information on the food performances of food items belonging to the same food class. These criteria consist of rules that underpin a collective interpretation of food items' characteristics and determine the reliability of information on food performances derived from those characteristics.

3.2. The Process Flow. In what follows, we give a description of the process flow that is also visually represented in Figure 1. The process flow consists of two streams, say 1 and 2, which are started by R and C, respectively.

In stream 1, R needs reliable information about a food item performance p. He/she provides the FIB with some identity property values i and asks the FIB for the value of p for that food item. The FIB checks whether the request can be immediately satisfied by consulting a solved-challenge database that collects answers given to previous requests. Otherwise, the FIB submits a new challenge question to the CCS. The CCS identifies the food characteristics necessary to determine the value of p and asks the FJM for their values for all food journal items with the same value of i. The CCS checks these data and decides whether the value of p can be computed and the collectively reliable criteria (established by the RA) are applicable. If so, the CCS determines the value of p, inserts the new record into the solved-challenge database, and sends the challenge answer to the FIB, which makes it understandable to R.

In stream 2, C examines a food item through his/her own devices (smart food things) in order to acquire measurement data on food item properties. He/she provides the FIB with these data and the descriptive identity data, say id, of that food item. The FIB collects and checks them, as well as other interaction context data captured by environmental sensors, and passes all the data to the FAM, which determines some food characteristic values, say c, by performing a food item diagnosis. The pair (id, c) is sent to the FJM, which stores it in the Food Journal.
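
To make the two streams more concrete, the following is a minimal Python sketch of the message flow described above. All names (classes, methods, and the in-memory stand-ins for the Solved Challenges DB and the Food Journal) are illustrative assumptions of ours, not part of an actual implementation; in a real platform these components would be distributed services rather than objects in one process.

```python
from dataclasses import dataclass, field

@dataclass
class FoodJournal:
    """Collects the (identity, characteristics) records produced in stream 2."""
    records: list = field(default_factory=list)

    def add(self, identity: dict, characteristics: dict) -> None:
        self.records.append((identity, characteristics))

    def query(self, identity: dict, wanted: list) -> list:
        """q(i, c): values of the characteristics c for all items with identity i."""
        return [{k: chars.get(k) for k in wanted}
                for ident, chars in self.records
                if all(ident.get(k) == v for k, v in identity.items())]

class CollectiveChallengeSolver:
    """Solves a challenge by querying the journal and applying the RA's criteria."""
    def __init__(self, journal, criteria):
        self.journal = journal
        self.criteria = criteria            # collectively reliable criteria (from the RA)

    def solve(self, identity: dict, performance: str):
        wanted = self.criteria.factors_of(performance)      # characteristics c for p
        values = self.journal.query(identity, wanted)        # q(i, c)
        return self.criteria.evaluate(performance, values)   # value of p, or "undetermined"

class FoodInformationBroker:
    """Entry point for both streams; keeps the Solved Challenges DB."""
    def __init__(self, solver, analysis_manager):
        self.solver = solver
        self.fam = analysis_manager
        self.solved_challenges = {}

    def handle_request(self, identity: dict, performance: str):
        """Stream 1: request r(i, p) coming from a Recipient."""
        key = (tuple(sorted(identity.items())), performance)
        if key not in self.solved_challenges:
            self.solved_challenges[key] = self.solver.solve(identity, performance)
        return self.solved_challenges[key]   # rendered human-readable for R

    def handle_contribution(self, identity: dict, sensor_data: dict) -> None:
        """Stream 2: measurement data coming from a Contributor."""
        characteristics = self.fam.diagnose(sensor_data)   # food item diagnosis
        self.solver.journal.add(identity, characteristics)
```

Here the RA's criteria and the FAM are opaque collaborators (criteria.factors_of, criteria.evaluate, and fam.diagnose are hypothetical interfaces); the scenario in Section 3.3 and the architecture in Section 4 suggest how they could be realized.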

3.3. Exemplification Scenario. In what follows, we present a scenario to clarify the collective process described above.

A consumer community faces the problem of knowing relevant water performances (e.g., safety) of a branded bottled water. A community member can act as a contributor (C) and/or a recipient (R).

Cs are community members equipped with lab-on-smartphones (taste-analysis-based devices connected to a smartphone) capable of acquiring data on the electrical impedance of water. Each of them examines a water sample, acquires electrical impedance data, and transmits them to the FIB together with some descriptive identity data (e.g., the product batch number). The FIB collects and checks these data coming from many Cs and sends them to the FAM, which makes a diagnosis of the sampled water. The FAM applies some methods, e.g., multiple regression analysis or principal component analysis, to identify chemical compounds (e.g., magnesium, calcium, sodium, poisoning elements such as cyanide, and heavy metal pollutants such as copper and arsenic) [23] and microbial properties (e.g., pathogenic bacteria such as the coliform group and Escherichia coli) [24]. These characteristic values of the water sample are permanently stored in the Food Journal.

R is a community member who needs to know performance values (e.g., safety) of a branded bottled water b. He/she uses his/her smartphone to scan the label of b to acquire the number of the batch that b belongs to, and he/she queries the FIB about the safety of the water contained in b. The FIB acquires R's request and determines whether it is well formed (e.g., batch number correctness, water performance checkability). If this request has not been solved previously, the FIB submits the following challenge to the CCS: "determine whether all bottles in the batch of b are safe." The CCS selects the water characteristics (e.g., cyanide, heavy metal pollutants) that it needs to know in order to solve the challenge. It then queries the FJM to obtain the characteristic values of previously analysed bottles belonging to the batch of b. Once these values are obtained, it solves the challenge by applying methods based on the collectively reliable criteria established by the RA. In carrying out this activity, the CCS could apply machine learning or statistical methods to establish:

(i) the set of relevant water characteristics (e.g., Escherichia coli, cyanide, copper, and arsenic);

(ii) how they combine to obtain category inspection indicators (e.g., pathogenic bacteria, heavy metal pollutants, and chemical contaminants);

(iii) how to use these indicators to determine the water safety performance.

Lastly, the CCS sends the challenge answer to the FIB, which could possibly generate a hazard warning to raise collective awareness of a safety risk related to the batch of water bottles that b belongs to.
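
As an illustration of how the CCS might apply collectively reliable criteria in this scenario, the short sketch below evaluates a batch against purely illustrative thresholds and a minimum number of analysed samples. The thresholds, the sample count, and the characteristic names are assumptions made here for exposition; they are not values prescribed by the process, by the RA, or by any regulation.

```python
# Illustrative only: thresholds and sample counts are NOT normative values.
SAFETY_THRESHOLDS = {          # assumed maximum admissible value per characteristic
    "cyanide_mg_l": 0.07,
    "arsenic_mg_l": 0.01,
    "copper_mg_l": 2.0,
    "e_coli_cfu_100ml": 0.0,
}
MIN_SAMPLES = 5                # assumed minimum of analysed bottles per batch

def batch_is_safe(samples: list) -> str:
    """Apply an assumed 'collectively reliable' criterion to one batch.

    samples: characteristic values of previously analysed bottles of the batch,
    as returned by the Food Journal Manager.
    """
    if len(samples) < MIN_SAMPLES:
        return "undetermined"              # not enough collective evidence yet
    for sample in samples:
        for name, limit in SAFETY_THRESHOLDS.items():
            value = sample.get(name)
            if value is None:
                return "undetermined"      # missing characteristic value
            if value > limit:
                return "unsafe"            # any exceeded limit triggers a hazard warning
    return "safe"

# Example: three compliant samples are still not enough evidence under this criterion.
print(batch_is_safe([{"cyanide_mg_l": 0.0, "arsenic_mg_l": 0.002,
                      "copper_mg_l": 0.1, "e_coli_cfu_100ml": 0.0}] * 3))
```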

4. Functional Architecture of a CFA Platform

In what follows, we describe a high-level architecture for a CFA platform that can support the collective process for sharing reliable food information. The architecture, illustrated in Figure 2, is structured as the classic three-tier architecture commonly found in today's software applications:

(i) An interface layer that enables the user to submit, retrieve, and manipulate data

(ii) An application layer that performs data processing and analysis

(iii) A storage layer where information is stored and retrieved from a persistent database.

In our platform architecture, the interface layer is the front-end interface between the user/consumer and the CFA platform back-end, and it is responsible for interactions with the external environment (user request formulation, sensor data acquisition, and information presentation/visualization to the user). In particular, the interface layer comprises simple and empowered nodes that the CFA platform uses to interact with the user, food items, and the surrounding environment. A simple node comprises user interface devices, while an empowered node also includes smart food things, environmental sensors, and wearable devices (a sketch of an empowered-node data payload follows the list below), where

(i) user interface devices are input-output devices (e.g., smartphones, tablets) that take input from and deliver output to the user in his/her foreground attention. These devices manage users' requests and manual data entry, acquire data from FIAs (e.g., from labels, tables, RFID chips, and NFC tags), and provide human-readable food information to users;

(ii) smart food things are sensing devices, owned by contributor users, that capture implicit signals from food (e.g., food near-infrared emission, food volatile compounds) with or without requiring the user's action or attention. Smart food things can be connected and synchronized with users' interface devices;

(iii) environmental sensors are networked sensors that capture environmental data without requiring the user's action or attention. They include sensors embedded in food packaging, containers, food appliances, and small tools (e.g., kitchen or cooking utensils), as well as ambient sensors;

(iv) wearable devices take input from the user in the background of his/her attention (also called peripheral attention) while he/she is involved in food consumption activities; examples include the many wearables for food intake monitoring.
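
To fix ideas, the snippet below shows one possible shape for the data that an empowered node could send to the Food Information Broker, combining identity data read from a FIA, a reading from a smart food thing, and ambient data from environmental sensors. The field names and the JSON encoding are our own illustrative assumptions, not a format defined by the platform.

```python
import json
from datetime import datetime, timezone

# Hypothetical payload assembled by an empowered node (all field names are illustrative).
payload = {
    "node": {"type": "empowered", "device": "smartphone+nir_scanner"},
    "identity": {                      # read from a FIA (e.g., label, NFC tag)
        "product_batch": "LOT-2018-0421",
        "production_date": "2018-03-02",
        "production_place": "IT-Lazio",
    },
    "measurement": {                   # captured by a smart food thing
        "kind": "nir_reflectance",
        "wavelengths_nm": [900, 1100, 1300, 1500, 1700],
        "values": [0.41, 0.38, 0.35, 0.33, 0.30],
    },
    "environment": {                   # captured by environmental/ambient sensors
        "temperature_c": 21.5,
        "humidity_pct": 48,
        "location": {"lat": 41.9, "lon": 12.5},
    },
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# The FIB would receive this as "unformatted" data and normalize it into a
# well-formed diagnostic document before forwarding it to the FAM.
print(json.dumps(payload, indent=2))
```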

The application layer comprises the following:

(i) Food Information Broker: this module has the following main functionalities:

(a) It receives unformatted digital data from an empowered node, translates them into a proper schema or grammar in order to generate a well-formed digital document (e.g., XML) representing a diagnostic data model, and then sends it to the FAM so that it can be processed;

(b) It receives data from a simple node and generates a formatted digital document to properly define the challenge to be solved by the CCS;

(c) It verifies whether a challenge has already been solved, by querying the Solved Challenges DB;

(d) It returns challenge results to a simple node in a formatted document that can be easily processed and converted into human-readable views.

(ii) FAM Data Analysis Engine Selector: this submodule receives the formatted diagnostic document from the FIB. By analyzing the document entities, it automatically selects at run time, from the Model DB, the library software modules for the FAM Data Analysis Solver. These modules implement some model/method (statistical, deep learning) for determining food characteristics from sensing data. The selection can be driven by the empowered-node features contained in the diagnostic document (a sketch of such a selection is given after this list).

(iii) FAM Data Analysis Solver: this submodule receives the selected software modules, which complete a food diagnosis process engine. Leveraging an auxiliary database (e.g., a food item training set), the engine produces the characteristic values of a single food item and stores them in the Food Journal.

(iv) Collective Challenge Solver: it receives a formatted challenge question from the FIB. It leverages a Food Class DB to analyze data coming from the Food Journal, in order to determine the challenge results according to some collectively reliable criteria. To perform its analysis, it may use complex software libraries such as extreme/deep learning machines, neural networks, classifiers, clustering algorithms, and statistical/regression algorithms.
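
The following is a minimal sketch of how the Data Analysis Engine Selector could pick an analysis module from the Model DB, driven by the measurement kind declared in the diagnostic document. The registry keys, function names, and diagnostic fields are hypothetical; a real Model DB would hold trained statistical or deep learning models rather than plain functions.

```python
# Hypothetical Model DB: measurement kind -> analysis routine (stand-ins for
# the statistical / deep-learning library modules mentioned in the text).
def analyse_nir_spectrum(measurement: dict) -> dict:
    # Placeholder for a chemometric model matching spectra against class models.
    return {"moisture_pct": 12.3, "protein_pct": 4.1}

def analyse_impedance(measurement: dict) -> dict:
    # Placeholder for a regression model estimating water composition.
    return {"arsenic_mg_l": 0.002, "e_coli_cfu_100ml": 0.0}

MODEL_DB = {
    "nir_reflectance": analyse_nir_spectrum,
    "electrical_impedance": analyse_impedance,
}

def select_engine(diagnostic_document: dict):
    """Data Analysis Engine Selector: choose a solver module at run time,
    driven by the empowered-node features in the diagnostic document."""
    kind = diagnostic_document["measurement"]["kind"]
    try:
        return MODEL_DB[kind]
    except KeyError:
        raise ValueError(f"no analysis module registered for '{kind}'")

# Usage: the FAM Data Analysis Solver runs the selected module and stores the
# resulting characteristic values in the Food Journal.
engine = select_engine({"measurement": {"kind": "nir_reflectance"}})
print(engine({"values": [0.41, 0.38, 0.35]}))
```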

The storage layer contains persistent food data. In particular, it comprises the following:

(i) Food Item Training Set: a database containing data and inference rules to determine food characteristics of a food item.

(ii) Model DB: a set of library software modules that can complete a diagnosis process engine.

(iii) Food Journal: a public ledger containing data on the food characteristics of analyzed food items (a minimal ledger sketch follows this list).

(iv) Food Class DB: a set of library software modules that are the systematic representation of the collectively reliable criteria established by the RA and used, on a case-by-case basis, to determine a food class performance.

(v) Solved Challenges DB: a database containing challenge questions already solved by the CCS.
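
Since the Food Journal is described as a public ledger, the sketch below shows one possible append-only record layout with a simple hash chain for tamper evidence. This is only an assumption about how such a ledger could be organized; the paper does not prescribe a schema or a specific ledger technology.

```python
import hashlib
import json

class FoodJournal:
    """Append-only ledger of (identity, characteristics) records.
    Each record carries the hash of the previous one for tamper evidence."""

    def __init__(self):
        self.entries = []

    def append(self, identity: dict, characteristics: dict) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"identity": identity,
                "characteristics": characteristics,
                "prev_hash": prev_hash}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        entry = {**body, "hash": digest}
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the hash chain to check that no record was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("identity", "characteristics", "prev_hash")}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev_hash"] != prev or e["hash"] != recomputed:
                return False
            prev = e["hash"]
        return True

journal = FoodJournal()
journal.append({"product_batch": "LOT-2018-0421"}, {"arsenic_mg_l": 0.002})
print(journal.verify())   # True
```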

5. Entities of the User-CFA Platform Interaction Context

In order to support the collective process described in Section 3, the CFA platform needs to acquire data from

(a) a user in foreground attention. The user explicitly interacts with platform interface devices that are in the foreground of his/her attention, i.e., he/she is intentionally conscious of interacting with the CFA platform. For instance, he/she could use handheld devices to get data from some food information artifacts, such as labels, RFID chips, and NFC tags, and transmit them to the platform from the place where the artifacts are located. He/she could also interact with smart food things in order to capture and communicate data on some property of a food item;

(b) a user in background attention. The user implicitly interacts with platform interface devices that are in the background of his/her attention, i.e., they escape the user's observation. For instance, wearable sensors could provide the CFA platform with data for real-time food intake monitoring [25];

(c) a food item or the environment, without requiring any user action or attention. Some smart things automatically detect food properties and environmental conditions and transmit the related data to the platform. They include sensors embedded in food packaging, containers, food appliances, and small tools (e.g., kitchen or cooking utensils), as well as environmental sensors.

In what follows we summarize the main entities with their properties (attributes) that are relevant for the CFA process and characterize the CFA platform interaction context.

Context entities:

(i) user: a consumer who interacts with the platform through interface devices (including his/her own handheld devices) located in the environment, as he/she participates in the CFA process as a recipient or a contributor. In the recipient role, he/she asks the platform to give him/her validated information about a food attribute. In the contributor role, he/she contributes to the validation process by communicating food item information (a class identifier and a food attribute value) and other interaction context information to the platform;

(ii) food: it refers to a food item with which the user and the platform can interact. Food-related stimuli are perceived by the user and, possibly, smart food things detect signals coming from the food item. Attribute values of the food item can be exchanged during the interaction between the user and the platform;

(iii) environment: it is the physical and organizational environment where interactions take place (e.g., a home kitchen, a restaurant, and a food shop). Environmental conditions have direct or indirect influence on the behaviour of both consumer and interface devices during the interaction. Physical properties, like light, humidity, temperature, localization, and spatial layout of the environment, may affect both consumers' perception and instrumental measurements of food item properties. Organizational aspects, like rules, shop opening hours, and working time, may drive the provision of information from the platform.

The context entity attributes are

(i) Identity. It refers to properties that identify a context entity or a class the entity belongs to. In particular, the CCS of the platform can build a food class identity by inferring class properties from food item data coming from instrumental evaluations of food item qualities;

(ii) Time. It comprises temporal aspects that may range from a current time representation to a complete time history of context entity properties. When referred to a food item, values of this attribute allow the CFA platform to recognize or predict over time the qualities of food items in a certain class. For instance, a time series of property values, like temperature, pH, or microbial growth, could be used to generate and share, within a consumer community, information on when food items of a certain class are at their nutritional best and safe to eat, or on when they should be disposed of to avoid ill health (a sketch of such a use of time series appears after this list);

(iii) Location. Values of this attribute may be quantitative or qualitative. They represent current and previous positions of context entities in absolute or relative terms. When referred to food, location is a fundamental attribute when creating a CFA based on geographical traceability or geographic-based origin determination of food products.

(iv) Activity. It refers to fundamental changes of entity attributes that occur when a food activity is performed by a consumer. In particular, changes in food item characteristics, like surface conditions, temperature, or size, could be used by the CFA platform to drive collective awareness of consumption activities (e.g., cooking or eating) on a certain class of food items.
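
As a concrete example of how the Time attribute could be used (see item (ii) above), the sketch below derives from a small temperature/pH time series whether items of a food class are still within an assumed freshness window. The rule, the thresholds, and the field names are illustrative assumptions, not criteria defined by the platform or the RA.

```python
from datetime import datetime, timedelta

# Illustrative time series for one food class: (timestamp, temperature in C, pH).
history = [
    (datetime(2018, 5, 1, 8),  4.0, 6.5),
    (datetime(2018, 5, 1, 20), 6.5, 6.3),
    (datetime(2018, 5, 2, 8),  9.0, 6.0),
]

def hours_above(history, temp_limit=7.0):
    """Total hours during which the temperature exceeded an assumed limit."""
    total = timedelta()
    for (t0, temp0, _), (t1, _, _) in zip(history, history[1:]):
        if temp0 > temp_limit:
            total += t1 - t0
    return total.total_seconds() / 3600

def still_safe_to_eat(history, max_hours_warm=6.0, min_ph=6.1):
    """Assumed rule: flag items once the cold chain breaks or pH drops too far."""
    last_ph = history[-1][2]
    return hours_above(history) <= max_hours_warm and last_ph >= min_ph

print(still_safe_to_eat(history))   # False: pH has dropped below the assumed limit
```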

6. Food Analysis Technologies for a CFA Platform: A Review of Reviews

Food analysis technologies are based on a plethora of quantitative/qualitative food analysis techniques and methodologies investigated by researchers from various scientific fields. These methods aim to automatically acquire food item information (e.g., food quality traits) using sensor devices, and they can be employed in technical approaches to the development of a CFA platform. Here, we refer to a technical approach as a collection of techniques, tools, devices, and knowledge that is applied to measure certain food characteristics (i.e., physical, chemical, biological, and microbiological attributes) in order to determine a certain set of food performances.

In this section, we present a review of review articles that were published from 2012 to 2017 and explicitly referred to technologies capable of nondestructively acquiring and quantifying food characteristics (external and internal quality attributes) for fast, real-time food performance assessment. The intent is to answer the following questions:

(i) Which technical approaches to food-data capture and analysis are investigated in scientific research literature?

(ii) Which food characteristics could be detected by these approaches?

(iii) Which information on food performances could be provided?

Following Kitchenham [26], we have undertaken a systematic literature review of reviews, in order to provide a complete, exhaustive summary of the current literature relevant to our research questions. The steps of the methodology we followed are described below, while Figure 3 shows the workflow we adopted:

(i) Step 0. Initialization: we selected Scopus as the scientific database in which to perform our search. Scopus delivers a comprehensive overview of the world's research output in our domain of reference and can handle advanced queries. We initialized a list L of search keywords with English terms related to technologies capable of nondestructively acquiring and quantifying food characteristics (e.g., "spectroscopy," "camera photo," "e-nose," "e-tongue," and "machine vision," as well as synonyms and other broader terms).

(ii) Step 1. Search process: we performed a search on the Scopus database using the keywords in the list L coupled with the term "food" and other terms used for major food groups; then, we filtered the retrieved papers by keeping only those indexed as reviews and published since January 2012.

(iii) Step 2. Screening relevant papers: we manually analysed metadata (authors, title, source, and year) in order to detect and remove duplicated items. Moreover, we analysed the abstract of each paper in order to determine whether it matched our inclusion criteria:

(a) the paper is classifiable as a research paper review;

(b) the review specifically focuses on research applications for detection and classification of food properties;

Moreover, the list L was possibly extended by adjoining new terms found among the author keywords of each paper.

Steps 1 and 2 were performed iteratively until no new keywords or new papers were found. At the end of this cycle, we obtained the final set R of review papers to be analysed.

(iv) Step 3. Review paper analysis: for each review paper r ∈ R, we identified the set TRP(r) of technology research patterns that the paper focuses on. An element of TRP(r) is represented by a triple (t_i, C_i, P_i), where t_i is a technical approach, C_i is the set of food characteristics measured by t_i, and P_i is the set of food performances determined by t_i from the values of the food characteristics in C_i (a small illustrative encoding of such triples follows).
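
To make the notion of a technology research pattern concrete, the sketch below encodes a few TRP triples (t_i, C_i, P_i) and groups them by technical approach, which is roughly the aggregation behind Table 4. The example triples and paper identifiers are illustrative placeholders loosely inspired by the reviews discussed below, not a transcription of the actual tables.

```python
from collections import defaultdict
from typing import NamedTuple

class TRP(NamedTuple):
    """Technology research pattern: (technical approach, characteristics, performances)."""
    approach: str
    characteristics: frozenset
    performances: frozenset

# Illustrative patterns extracted from review papers (paper id -> set of TRPs).
trp_by_paper = {
    "r1":  {TRP("NIR spectroscopy", frozenset({"chemical composition"}),
                frozenset({"safety"}))},
    "r38": {TRP("hyperspectral imaging", frozenset({"surface defects", "moisture"}),
                frozenset({"quality", "safety"}))},
    "r52": {TRP("e-nose", frozenset({"volatile organic compounds"}),
                frozenset({"authenticity"}))},
}

# Group the patterns by technical approach and list the papers focusing on each
# approach (the kind of summary reported in Table 4).
summary = defaultdict(lambda: {"patterns": set(), "papers": set()})
for paper, patterns in trp_by_paper.items():
    for trp in patterns:
        summary[trp.approach]["patterns"].add(trp)
        summary[trp.approach]["papers"].add(paper)

for approach, info in summary.items():
    print(approach, "<-", sorted(info["papers"]))
```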

6.1. Results and Discussion. The resulting set R consists of 67 review papers, whose references are listed in the Appendix. In what follows, we present and discuss the results with respect to the research questions we posed at the beginning of this section. Table 1 shows the set T of technical approaches reported in the literature, Table 2 describes the food characteristics that can be detected by these approaches, and Table 3 shows the set of information on food performances that can be determined.

In Table 4, for each technical approach t_i, we summarize the set of technology research patterns that include t_i, and we indicate the review papers focusing on it.

From these results, it emerges that five classes of technologies are promising candidates to be a valuable addition to the development of CFA platforms:

(i) Spectroscopy. These technologies are mostly based on vibrational spectroscopic data acquisition and statistical analyses (e.g., principal component analysis, supervised pattern recognition techniques). The former collects spectroscopic data (e.g., mid- and near-infrared reflectance or transflectance data) by measuring molecular vibrations either through the absorption of light quanta or the inelastic scattering of photons; the latter are suited to performing targeted and nontargeted screening of ingredients using spectral profiles [27, 28]. They are at the core of food knowledge-based approaches aimed at analyzing foods at the molecular level. In most laboratory research, they are used to collect spectroscopic data from scanned training food samples, to build a classification or cluster model according to known values of a certain property, and to determine the property value of a new food sample by matching the sample's spectroscopic data against class models [29] (a minimal sketch of such a classification pipeline is given after this list). For example, spectroscopic analysis has been successfully applied to food safety analysis and prediction for several food categories, such as meat, fish, and fruits and vegetables. In particular, the verification through spectroscopy of freshness and of the presence of adulterants (or improper substances) in food can be based both on the chemical compounds of the food and on the analysis of some properties (such as pH, TVB-N, and K1), as well as on analytical techniques based on microbial counts. Reviews highlight that several methods to assess food freshness have been developed; such methods are based on the measurement of food deteriorative changes associated with microbial growth and chemical changes.

(ii) Machine vision. Recognition methods embedded in computer vision systems can detect visible characteristics by analyzing food images captured with a camera-enabled device (e.g., a smartphone camera). They can be employed to determine data relating to the mass, weight, and volume of a food product and to identify its food category and subcategory. However, several reviews highlight substantial obstacles to recognizing food in complex cases, such as a home-cooked meal or a composite plate [30]. Combinations of these methods with databases of food knowledge (e.g., nutritional facts tables) and consumers' profiles can be applied to provide quantitative analysis of various food aspects (e.g., the amount of calories and nutrients in the food), even in a personalized manner. Furthermore, other contextual clues, such as restaurant location and menus, can also be utilized to augment or improve the information provided by the combination of these methods [31-33];

(iii) Hyperspectral imaging. Hyperspectral imaging (HSI) is an approach that integrates conventional imaging and spectroscopy to attain both spatial and spectral information from a food object. "The spatial features of HSI enable characterization of complex heterogeneous samples, whereas the spectral features allow a vast range of multiconstituent surface and subsurface features to be identified" [34]. Applications of this technology make it possible to analyze food quality, freshness, and safety, especially for fruits and vegetables (Pu et al. [35]);

(iv) Odour analysis (e-noses). These technologies mimic the human sense of smell by identifying and analyzing some food properties on the basis of the food's odour. The employed methods are based on an array of sensors for the chemical detection and analysis of volatile organic compounds (VOCs) and a pattern recognition unit [36]. The sensing system consists of broadly tuned sensors (optical, electrochemical, and gravimetric) that are able to infer a variation in the concentration of a gas. Optical sensors work by detecting a shift in the emission or absorption of different types of electromagnetic radiation on binding with a desired analyte [37]; electrochemical sensors detect a variation in the electrical conductivity of a gas, while gravimetric sensors detect a variation in its mass [38]. These technologies are mainly used to discriminate different food varieties for food authenticity and adulteration assessment [39];

(v) Taste analysis (e-tongues). These technologies are based on analytical tools mimicking the functions of human gustatory receptors. Liquid samples are analysed directly without any preparation, while solids require a preliminary dissolution before measurement [40]. Like odour analysis systems, taste analysis tools include an array of nonspecific sensors and a set of appropriate methods for pattern recognition [41]. They are employed to identify the variety or geographical origin, to detect adulteration, and to assess the authenticity of many food products [42].
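
As a concrete illustration of the spectroscopy workflow referenced in item (i) above (collect spectra from training samples, build a class model, classify a new sample), the following sketch applies scikit-learn's PCA and a k-nearest-neighbours classifier to synthetic spectra. The data are randomly generated and the pipeline is a minimal stand-in for the chemometric methods surveyed in the reviews, not a reproduction of any of them.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic NIR-like spectra: 60 training samples, 200 wavelengths, two classes
# (e.g., "fresh" vs "spoiled"); real data would come from scanned food samples.
fresh = rng.normal(0.50, 0.02, size=(30, 200))
spoiled = rng.normal(0.55, 0.02, size=(30, 200))
X_train = np.vstack([fresh, spoiled])
y_train = np.array(["fresh"] * 30 + ["spoiled"] * 30)

# Dimensionality reduction with PCA followed by a simple supervised classifier.
model = make_pipeline(PCA(n_components=5), KNeighborsClassifier(n_neighbors=3))
model.fit(X_train, y_train)

# A new sample's spectrum is matched against the class models built from training data.
new_sample = rng.normal(0.55, 0.02, size=(1, 200))
print(model.predict(new_sample))   # expected: ['spoiled']
```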

7. Conclusions

Today's consumers increasingly need reliable food information for their food consumption activities, so as to become aware of the wider consequences of the decisions they make. Recent cases of adulteration, allegations of fraud, and subterfuges that have hit the food sector have reinforced this trend. Conventional ways of providing food information (e.g., labelling, mass media) have limited chances of satisfying this need, as they are usually product/producer centered and driven by food producers and distributors that tend to reveal only information that suits their marketing approach.

In contrast, we have introduced a democratic, bottom-up approach that makes consumers more food aware by helping them to make more informed decisions in their food-related activities. This approach leverages the superorganism paradigm and the capabilities of smart food technologies in determining physical, biochemical, and microbiological properties of food and beverages. At its core, there is a cooperative process aimed at fostering collective food awareness by letting a consumer community share reliable information derived from scientific instrumental measurements of food properties.

The main contribution of this paper is to envisage the organization of such a process, as well as a technological platform capable of supporting it. Moreover, in order to point out significant research outcomes potentially useful for developing the platform, we have conducted a survey of academic papers reviewing technical approaches for determining food characteristics and performances.

We conclude by addressing what we view as the limitations of this article and areas for further development.

Firstly, we have presented only a framework in which details of the cooperative process remain unspecified. For instance, how should a criterion for deriving a food class performance be defined? When do we consider such a criterion "collectively reliable"? How do we empirically assess the effectiveness of the cooperative process? These are relevant questions when it comes to translating our framework into concrete guidelines for the platform design.

Secondly, all the reviews in our survey have been conducted by scholars and, thus, are concerned with research findings oriented to clarifying or discovering the conceptual state of a technology. A more relevant contribution would come from investigating the current gaps between technology research and the mobile food diagnostics tools already available. Identifying and understanding knowledge and application gaps is vital for researchers, so that they can recognize the technical challenges, missing insights, or pieces of complementary technology needed to move from research to the development and viability of a platform for collective food awareness.

For us, the above considerations suggest a clear direction for future research. Together with a more extensive exploration of our process model, we need empirical work that reflects both technological and food consumer behaviour perspectives.

Appendix

See Table 5.
Table 5

Review paper                      Reference

r1           Qu, J. H., Liu, D., Cheng, J.H., Sun, D. W., Ma, J.,
          Pu, H., & Zeng, X. A. (2015). Applications of near-infrared
            spectroscopy in food safety evaluation and control: A
           review of recent research advances. Critical reviews in
                food science and nutrition, 55(13), 1939-1954.

r2        Barbin, D. F., Felicio, A. L.D. S.M., Sun, D.W., Nixdorf,
           S. L., & Hirooka, E. Y. (2014). Application of infrared
              spectral techniques on quality and compositional
              attributes of coffee: An overview. Food Research
                          International, 61, 23-32.

r3           Wang, L., Sun, D. W., Pu, H., & Cheng, J. H. (2017).
           Quality analysis, classification, and authentication of
           liquid foods by near-infrared spectroscopy: A review of
           recent research developments. Critical reviews in food
                   science and nutrition, 57(7), 1524-1538.

r4           Wu, D., & Sun, D. W. (2013). Colour measurements by
          computer vision for food quality control-A review. Trends
                  in Food Science & Technology, 29(1), 5-20.

r5           Li, J. L., Sun, D.W., & Cheng, J. H. (2016). Recent
            advances in nondestructive analytical techniques for
          determining the total soluble solids in fruits: a review.
           Comprehensive Reviews in Food Science and Food Safety,
                               15(5), 897-911.

r6         Fu, X., & Ying, Y. (2016). Food safety evaluation based
            on near infrared spectroscopy and imaging: a review.
               Critical reviews in food science and nutrition,
                              56(11), 1913-1924.

r7           Porep, J. U., Kammerer, D. R., & Carle, R. (2015).
           On-line application of near infrared (NIR) spectroscopy
          in food production. Trends in Food Science & Technology,
                               46(2), 211-230.

r8          He, H. J., & Sun, D. W. (2015). Microbial evaluation of
          raw and processed food products by Visible/Infrared, Raman
           and Fluorescence spectroscopy. Trends in Food Science &
                         Technology, 46(2), 199-210.

r9           Magwaza, L. S., & Tesfay, S. Z. (2015). A review of
           destructive and non-destructive methods for determining
           Avocado fruit maturity. Food and bioprocess technology,
                              8(10), 1995-2011.

r10         Urickova, V., & Sadecka, J. (2015). Determination of
              geographical origin of alcoholic beverages using
          ultraviolet, visible and infrared spectroscopy: A review.
           Spectrochimica Acta Part A: Molecular and Biomolecular
                         Spectroscopy, 148, 131-137.

r11        Dai, Q., Cheng, J.H., Sun, D. W., & Zeng, X.A. (2015).
           Advances in feature selection methods for hyperspectral
          image processing in food industry applications: a review.
               Critical reviews in food science and nutrition,
                              55(10), 1368-1382.

r12         Gowen, A. A., Feng, Y., Gaston, E., & Valdramidis, V.
                 (2015). Recent applications of hyperspectral
                imaging in microbiology. Talanta, 137, 43-54.

r13         Lohumi, S., Lee, S., Lee, H., & Cho, B. K. (2015). A
           review of vibrational spectroscopic techniques for the
           detection of food authenticity and adulteration. Trends
                 in Food Science & Technology, 46(1), 85-98.

r14       Cheng, J. H., & Sun, D.W. (2015). Recent applications of
          spectroscopic and hyperspectral imaging techniques with
           chemometric analysis for rapid inspection of microbial
           spoilage in muscle foods. Comprehensive Reviews in Food
                   Science and Food Safety, 14(4), 478-490.

r15         Xiong, Z., Xie, A., Sun, D.W., Zeng, X. A., & Liu, D.
              (2015). Applications of hyperspectral imaging in
          chicken meat safety and quality detection and evaluation:
               a review. Critical reviews in food science and
                         nutrition, 55(9), 1287-1301.

r16         He, H. J., Wu, D., & Sun, D. W. (2015). Nondestructive
                   spectroscopic and imaging techniques for
             quality evaluation and assessment of fish and fish
          products. Critical reviews in food science and nutrition,
                               55(6), 864-886.

r17        Schmitt, S., Garrigues, S., & de la Guardia, M. (2014).
            Determination of the mineral composition of foods by
           infrared spectroscopy: A review of a green alternative.
          Critical reviews in analytical chemistry, 44(2), 186-197.

r18          Fox, G., & Manley, M. (2014). Applications of single
             kernel conventional and hyperspectral imaging near
              infrared spectroscopy in cereals. Journal of the
               Science of Food and Agriculture, 94(2), 174-179.

r19       Hossain, M. Z., & Goto, T. (2014). Near-and mid-infrared
           spectroscopy as efficient tools for detection of fungal
           and mycotoxin contamination in agricultural commodities.
                   World Mycotoxin Journal, 7(4), 507-515.

r20        Chen, G. Y., Huang, Y. P., & Chen, K. J. (2014). Recent
           advances and applications of near infrared spectroscopy
            for honey quality assessment. Advance Journal of Food
                    Science and Technology, 6(4), 461-467.

r21          Damez, J. L., & Clerjon, S. (2013). Quantifying and
            predicting meat and meat products quality attributes
           using electromagnetic waves: An overview. Meat science,
                               95(4), 879-896.

r22       Cattaneo, T.M., & Holroyd, S. E. (2013). The use of near
           infrared spectroscopy for determination of adulteration
             and contamination in milk and milk powder: updating
              knowledge. Journal of Near Infrared Spectroscopy,
                               21(5), 341-349.

r23       Cheng, J. H., Dai, Q., Sun, D. W., Zeng, X. A., Liu, D.,
              & Pu, H. B. (2013). Applications of non-destructive
            spectroscopic techniques for fish quality and safety
             evaluation and inspection. Trends in Food Science &
                          Technology, 34(1), 18-31.

r24            Zhang, X. (2013). Application of near infrared
              reflectance spectroscopy to predict meat chemical
              compositions: A review. Spectroscopy and Spectral
                         Analysis, 33(11), 3002-3009.

r25         Yibin, F. X. Y. (2013). Application of NIR and Raman
          Spectroscopy for Quality and Safety Inspection of Fruits
              and Vegetables: A Review [J]. Transactions of the
             Chinese Society for Agricultural Machinery, 8, 027.

r26       Lopez, A., Arazuri, S., Garcia, I., Mangado, J., & Jaren,
           C. (2013). A review of the application of near-infrared
            spectroscopy for the analysis of potatoes. Journal of
             agricultural and food chemistry, 61(23), 5413-5424.

r27         Chen, L., & Opara, U. L. (2013). Texture measurement
              approaches in fresh and processed foods--A review.
                 Food Research International, 51(2), 823-835.

r28       Dale, L. M., Thewis, A., Boudry, C., Rotar, I., Dardenne,
           P., Baeten, V., & Pierna, J.A. F. (2013). Hyperspectral
          imaging applications in agriculture and agro-food product
                quality and safety control: a review. Applied
                    Spectroscopy Reviews, 48(2), 142-159.

r29            Feng, Y. Z., & Sun, D.W. (2012). Application of
             hyperspectral imaging in food safety inspection and
           control: a review. Critical reviews in food science and
                        nutrition, 52(11), 1039-1058.

r30          ElMasry, G., Kamruzzaman, M., Sun, D. W., & Allen, P.
            (2012). Principles and applications of hyperspectral
           imaging in quality evaluation of agro-food products: a
           review. Critical reviews in food science and nutrition,
                              52(11), 999-1023.

r31           Cozzolino, D. (2012). Recent trends on the use of
           infrared spectroscopy to trace and authenticate natural
            and agricultural food products. Applied Spectroscopy
                           Reviews, 47(7), 518-530.

r32       Ellis, D. I., Brewster, V. L., Dunn, W. B., Allwood, J.W.,
           Golovanov, A. P., & Goodacre, R. (2012). Fingerprinting
            food: current technologies for the detection of food
              adulteration and contamination. Chemical Society
                         Reviews, 41(17), 5706-5727.

r33        Zhang, R., Ying, Y., Rao, X., & Li, J. (2012). Quality
           and safety assessment of food and agricultural products
            by hyperspectral fluorescence imaging. Journal of the
             Science of Food and Agriculture, 92(12), 2397-2408.

r34       ElMasry, G., Barbin, D. F., Sun, D. W., & Allen, P. (2012).
              Meat quality evaluation by hyperspectral imaging
          technique: an overview. Critical Reviews in Food Science
                        and Nutrition, 52(8), 689-711.

r35         Martin, C. K., Nicklas, T., Gunturk, B., Correa, J. B.,
            Allen, H. R., & Champagne, C. (2014). Measuring food
              intake with digital photography. Journal of Human
                     Nutrition and Dietetics, 27, 72-81.

r36        Sharp, D. B., & Allman-Farinelli, M. (2014). Feasibility
           and validity of mobile phones to assess dietary intake.
                       Nutrition, 30(11-12), 1257-1266.

r37          Ma, J., Sun, D. W., Qu, J. H., Liu, D., Pu, H., Gao,
         W. H., & Zeng, X.A. (2016). Applications of computer vision
          for assessing quality of agri-food products: a review of
             recent research advances. Critical reviews in food
                    science and nutrition, 56(1), 113-127.

r38          Pu, Y. Y., Feng, Y. Z., & Sun, D. W. (2015). Recent
           progress of hyperspectral imaging on quality and safety
               inspection of fruits and vegetables: a review.
           Comprehensive Reviews in Food Science and Food Safety,
                               14(2), 176-188.

r39         Devi, P. V., & Vijayarekha, K. (2014). Machine vision
                applications to locate fruits, detect defects
         and remove noise: a review. Rasayan J. Chem, 7(1), 104-113.

r40       Zhang, B., Huang, W., Li, J., Zhao, C., Fan, S., Wu, J.,
               & Liu, C. (2014). Principles, developments and
            applications of computer vision for external quality
             inspection of fruits and vegetables: A review. Food
                     Research International, 62, 326-343.

r41       Dowlati, M., de la Guardia, M., & Mohtasebi, S. S. (2012).
          Application of machine-vision techniques to fish-quality
              assessment. TrAC Trends in Analytical Chemistry,
                                 40, 168-179.

r42            Rady, A.M., & Guyer, D. E. (2015). Rapid and/or
          nondestructive quality evaluation methods for potatoes: A
              review. Computers and electronics in agriculture,
                                 117, 31-48.

r43        Steele, R. (2015). An overview of the state of the art
             of automated capture of dietary intake information.
               Critical reviews in food science and nutrition,
                              55(13), 1929-1938.

r44          Mahajan, S., Das, A., & Sardana, H. K. (2015). Image
          acquisition techniques for assessment of legume quality.
             Trends in Food Science & Technology, 42(2), 116-133.

r45       Zhang, B., Huang, W., Li, J., Zhao, C., Fan, S., Wu, J.,
               & Liu, C. (2014). Principles, developments and
            applications of computer vision for external quality
             inspection of fruits and vegetables: A review. Food
                     Research International, 62, 326-343.

r46             Banerjee, R., Tudu, B., Bandyopadhyay, R., &
             Bhattacharyya, N. (2016). A review on combined odor
           and taste sensor systems. Journal of Food Engineering,
                                 190, 10-21.

r47          Beltran Ortega, J., Martinez Gila, D. M., Aguilera
          Puerto, D., Gamez Garcia, J., & Gomez Ortega, J. (2016).
          Novel technologies for monitoring the in-line quality of
             virgin olive oil during manufacturing and storage.
               Journal of the Science of Food and Agriculture,
                              96(14), 4644-4662.

r48          Jha, S. N., Jaiswal, P., Grewal, M.K., Gupta, M., &
              Bhardwaj, R. (2016). Detection of adulterants and
          contaminants in liquid foods--a review. Critical reviews
              in food science and nutrition, 56(10), 1662-1684.

r49        Wang, Y., Li, Y., Yang, J., Ruan, J., & Sun, C. (2016).
         Microbial volatile organic compounds and their application
          in microorganism identification in foodstuff. TrAC Trends
                      in Analytical Chemistry, 78, 1-16.

r50          Balasubramanian, S., Amamcharla, J., & Shin, J. E.
           (2016). Possible application of electronic nose systems
            for meat safety: An overview. In Electronic Noses and
                     Tongues in Food Science(pp. 59-71).

r51       Wisniewska, P., Dymerski, T., Wardencki, W., & Namiesnik,
                J. (2015). Chemical composition analysis and
          authentication of whisky. Journal of the Science of Food
                     and Agriculture, 95(11), 2159-2166.

r52        Loutfi, A., Coradeschi, S., Mani, G. K., Shankar, P., &
            Rayappan, J. B. B. (2015). Electronic noses for food
               quality: A review. Journal of Food Engineering,
                                144, 103-111.

r53        Sliwinska, M., Wisniewska, P., Dymerski, T., Namiesnik,
           J., & Wardencki, W. (2014). Food analysis using artificial
             senses. Journal of agricultural and food chemistry,
                              62(7), 1423-1448.

r54             Peris, M., & Escuder-Gilabert, L. (2013). On-line
              monitoring of food fermentation processes using
             electronic noses and electronic tongues: a review.
                     Analytica chimica acta, 804, 29-36.

r55        Smyth, H., & Cozzolino, D. (2012). Instrumental methods
           (spectroscopy, electronic nose, and tongue) as tools to
            predict taste and aroma in beverages: advantages and
              limitations. Chemical reviews, 113(3), 1429-1440.

r56          Ponzoni, A., Comini, E., Concina, I., Ferroni, M.,
          Falasconi, M., Gobbi, E., Sberveglieri, V., Sberveglieri,
            G. (2012). Nanostructuredmetal oxide gas sensors, a
          survey of applications carried out at sensor lab, Brescia
          (Italy) in the security and food quality fields. Sensors,
                             12(12), 17023-17045.

r57         Ceto, X., Voelcker,N. H., & Prieto-Simon, B. (2016).
            Bioelectronic tongues: New trends and applications in
           water and food analysis. Biosensors and bioelectronics,
                                 79, 608-626.

r58          Wadehra, A., & Patil, P. S. (2016). Application of
              electronic tongues in food processing. Analytical
                           Methods, 8(3), 474-480.

r59         Alessio, P., Constantino, C. J. L., Daikuzono, C. M.,
          Riul, A., & deOliveira,O. N. (2016). Analysis of Coffees
          Using Electronic Tongues. In Electronic Noses and Tongues
                        in Food Science (pp. 171-177).

r60         Jimenez-Jorquera, C., & Gutierrez-Capitan, M. (2016).
             Electronic Tongues Applied to Grape and Fruit Juice
              Analysis. In Electronic Noses and Tongues in Food
                            Science (pp. 189-198).

r61        Ha, D., Sun, Q., Su, K., Wan, H., Li, H., Xu, N., Sun,
               F., Zhuang, L., Hu, N., Wang, P. (2015). Recent
             achievements in electronic tongue and bioelectronic
              tongue as taste sensors. Sensors and Actuators B:
                          Chemical, 207, 1136-1146.

r62         Tahara, Y., & Toko, K. (2013). Electronic tongues--a
               review. IEEE Sensors Journal, 13(8), 3001-3011.

r63       Yasuura, M., & Toko, K. (2015). Review of development of
             sweetness sensor. IEEJ Transactions on Sensors and
                        Micromachines, 135(2), 51-56.

r64        Gutierrez-Capitan, M., Capdevila, F., Vila-Planas, J.,
                Domingo, C., Buttgenbach, S., Llobera, A.,
            Puig-Pujol, A., Jimenez-Jorquera, C. (2014).
              Hybrid electronic tongues applied to the quality
                control of wines. Journal of Sensors, 2014.

r65        Chen, Z., Wu, J., Zhao, Y., Xu, F., & Hu, Y. (2012).
                Recent advances in bitterness evaluation
              methods. Analytical Methods, 4(3), 599-608.

r66        Latha, R. S., & Lakshmi, P. K. (2012). Electronic
             tongue: an analytical gustatory tool. Journal of
            advanced pharmaceutical technology & research,
                        3(1), 3.

r67        Rateni, G., Dario, P., & Cavallo, F. (2017).
            Smartphone-based food diagnostic technologies:
                  a review. Sensors, 17(6), 1453.


https://doi.org/10.1155/2018/8608407

Data Availability

All data generated or analysed during this study are included in this article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] European Commission, "ICT research area on "Collective Awareness Platforms for Sustainability and Social Innovation"," http://ec.europa.eu/information_society/activities/collectiveawareness, 2012.

[2] F. Bellini, A. Passani, M. Klitsi, and W. Vanobberger, Exploring Impacts of Collective Awareness Platforms for Sustainability and Social Innovation, Eurokeis Press, Rome, Italy, 2016.

[3] F. Sestini, "Collective awareness platforms: Engines for sustainability and ethics," IEEE Technology and Society Magazine, vol. 31, no. 4, pp. 54-62, 2012.

[4] A. P. Volpentesta, A. M. Felicetti, and S. Ammirato, "Intelligent Food Information Provision to Consumers in an Internet of Food Era," in Collaboration in a Data-Rich World, vol. 506 of IFIP Advances in Information and Communication Technology, pp. 725-736, Springer International Publishing, 2017.

[5] L. Atzori, A. Iera, G. Morabito, and M. Nitti, "The social internet of things (SIoT)--when social networks meet the internet of things: concept, architecture and network characterization," Computer Networks, vol. 56, no. 16, pp. 3594-3608, 2012.

[6] C. Evers, R. Kniewel, K. Geihs, and L. Schmidt, "The user in the loop: enabling user participation for self-adaptive applications," Future Generation Computer Systems, vol. 34, pp. 110-123, 2014.

[7] A. Botta, W. de Donato, V. Persico, and A. Pescape, "Integration of cloud computing and internet of things: a survey," Future Generation Computer Systems, vol. 56, pp. 684-700, 2016.

[8] J. Surowiecki, The Wisdom of Crowds, Doubleday, 2004.

[9] N. Bicocchi, A. Cecaj, D. Fontana, M. Mamei, A. Sassi, and F. Zambonelli, "Collective awareness for human-ICT collaboration in smart cities," in Proceedings of the IEEE 22nd International Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (WETICE '13), pp. 3-8, 2013.

[10] C.-W. Tsai, C.-F. Lai, and A. V. Vasilakos, "Future internet of things: open issues and challenges," Wireless Networks, vol. 20, no. 8, pp. 2201-2217, 2014.

[11] K. Zia, D. K. Saini, A. Muhammad, and A. Ferscha, "Nature-Inspired Computational Model of Population Desegregation Under Group Leaders Influence," IEEE Transactions on Computational Social Systems, vol. 5, no. 2, pp. 532-543, 2018.

[12] E. Bonabeau, M. Dorigo, and G. Theraulaz, Swarm Intelligence: from Natural to Artificial Systems, Oxford University Press, London, UK, 1998.

[13] N. Bicocchi, D. Fontana, M. Mamei, and F. Zambonelli, "Collective awareness and action in urban superorganisms," in Proceedings of the 2013 IEEE International Conference on Communications Workshops, (ICC '13), pp. 194-198, 2013.

[14] X.-S. Yang and M. Karamanoglu, "Swarm Intelligence and Bio-Inspired Computation: An Overview," Swarm Intelligence and Bio-Inspired Computation, pp. 3-23, 2013.

[15] P. Arpaia, C. Manna, G. Montenero, and G. D'Addio, "In-time prognosis based on swarm intelligence for home-care monitoring: A case study on pulmonary disease," IEEE Sensors Journal, vol. 12, no. 3, pp. 692-698, 2012.

[16] M. Kumawat, N. Gupta, N. Jain, and R. C. Bansal, "Swarm-Intelligence-Based Optimal Planning of Distributed Generators in Distribution Network for Minimizing Energy Loss," Electric Power Components and Systems, vol. 45, no. 6, pp. 589-600, 2017.

[17] D. Pradeepkumar and V. Ravi, "Forecasting financial time series volatility using Particle Swarm Optimization trained Quantile Regression Neural Network," Applied Soft Computing, vol. 58, pp. 35-52, 2017.

[18] A. P. Volpentesta and A. M. Felicetti, "Research Investigation on Food Information User's Behaviour," in Collaborative Networks of Cognitive Systems, L. M. Camarinha-Matos, H. Afsarmanesh, and Y. Rezgui, Eds., vol. 534 of IFIP Advances in Information and Communication Technology, pp. 190-202, Springer International Publishing, Heidelberg, Germany, 2018.

[19] F. J. Xu, V. P. Zhao, L. Shan, and C. Huang, "A Framework for Developing Social Networks Enabling Systems to Enhance the Transparency and Visibility of Cross-border Food Supply Chains," GSTF Journal on Computing (JoC), vol. 3, no. 4, p. 132, 2014.

[20] B. Smith and W. Ceusters, "Aboutness: Towards foundations for the information artifact ontology," in Proceedings of the Sixth International Conference on Biomedical Ontology (ICBO '15), pp. 47-51, Lisbon, Portugal, 2015.

[21] L.-G. Zhang, X. Zhang, L.-J. Ni, Z.-B. Xue, X. Gu, and S.-X. Huang, "Rapid identification of adulterated cow milk by nonlinear pattern recognition methods based on near infrared spectroscopy," Food Chemistry, vol. 145, pp. 342-348, 2014.

[22] V. Corvello and A. M. Felicetti, "Factors affecting the utilization of knowledge acquired by researchers from scientific social networks: An empirical analysis," Knowledge Management, vol. 13, no. 3, pp. 15-26, 2014.

[23] K. Toko, "Taste sensor," Sensors and Actuators B: Chemical, vol. 64, no. 1-3, pp. 205-215, 2000.

[24] L. Franca, A. Lopez-Lopez, R. Rossello-Mora, and M. S. da Costa, "Microbial diversity and dynamics of a groundwater and a still bottled natural mineral water," Environmental Microbiology, vol. 17, no. 3, pp. 577-593, 2015.

[25] T. Vu, F. Lin, N. Alshurafa, and W. Xu, "Wearable Food Intake Monitoring Technologies: A Comprehensive Review," Computers, vol. 6, no. 1, p. 4, 2017.

[26] B. Kitchenham, Procedures for Performing Systematic Reviews, Keele University, Keele, UK, 2004.

[27] D. Cozzolino, "The role of vibrational spectroscopy as a tool to assess economically motivated fraud and counterfeit issues in agricultural products and foods," Analytical Methods, vol. 7, no. 22, pp. 9390-9400, 2015.

[28] G. Downey, Advances in Food Authenticity Testing, Woodhead Publishing Series in Food Science, Technology and Nutrition, 2016.

[29] M. Casale and R. Simonetti, "Review: Near infrared spectroscopy for analysing olive oils," Journal of Near Infrared Spectroscopy, vol. 22, no. 2, pp. 59-80, 2014.

[30] R. Steele, "An Overview of the State of the Art of Automated Capture of Dietary Intake Information," Critical Reviews in Food Science and Nutrition, vol. 55, no. 13, pp. 1929-1938, 2015.

[31] F. Kong and J. Tan, "DietCam: Automatic dietary assessment with mobile camera phones," Pervasive and Mobile Computing, vol. 8, no. 1, pp. 147-163, 2012.

[32] Y. Kawano and K. Yanai, "FoodCam: A real-time food recognition system on a smartphone," Multimedia Tools and Applications, vol. 74, no. 14, pp. 5263-5287, 2015.

[33] O. Beijbom, N. Joshi, D. Morris, S. Saponas, and S. Khullar, "Menu-match: Restaurant-specific food logging from images," in Proceedings of the 2015 IEEE Winter Conference on Applications of Computer Vision, (WACV '15), pp. 844-851, IEEE, 2015.

[34] S. Lohumi, S. Lee, H. Lee, and B.-K. Cho, "A review of vibrational spectroscopic techniques for the detection of food authenticity and adulteration," Trends in Food Science & Technology, vol. 46, no. 1, pp. 85-98, 2015.

[35] Y.-Y. Pu, Y.-Z. Feng, and D.-W. Sun, "Recent progress of hyperspectral imaging on quality and safety inspection of fruits and vegetables: A review," Comprehensive Reviews in Food Science and Food Safety, vol. 14, no. 2, pp. 176-188, 2015.

[36] M. Peris and L. Escuder-Gilabert, "A 21st century technique for food control: electronic noses," Analytica Chimica Acta, vol. 638, no. 1, pp. 1-15, 2009.

[37] J. E. Fitzgerald, E. T. H. Bui, N. M. Simon, and H. Fenniri, "Artificial Nose Technology: Status and Prospects in Diagnostics," Trends in Biotechnology, vol. 35, no. 1, pp. 33-42, 2017.

[38] A. D. Wilson and M. Baietto, "Applications and advances in electronic-nose technologies," Sensors, vol. 9, no. 7, pp. 5099-5148, 2009.

[39] S. N. Jha, P. Jaiswal, M. K. Grewal, M. Gupta, and R. Bhardwaj, "Detection of Adulterants and Contaminants in Liquid Foods-A Review," Critical Reviews in Food Science and Nutrition, vol. 56, no. 10, pp. 1662-1684, 2016.

[40] K. Toko, Y. Tahara, M. Habara, Y. Kobayashi, and H. Ikezaki, "Taste Sensor: Electronic Tongue with Global Selectivity," Essentials of Machine Olfaction and Taste, pp. 87-174, 2016.

[41] R. Banerjee, B. Tudu, R. Bandyopadhyay, and N. Bhattacharyya, "A review on combined odor and taste sensor systems," Journal of Food Engineering, vol. 190, pp. 10-21, 2016.

[42] M. Peris and L. Escuder-Gilabert, "Electronic noses and tongues to assess food authenticity and adulteration," Trends in Food Science & Technology, vol. 58, pp. 40-54, 2016.

Antonio P. Volpentesta, Alberto M. Felicetti, and Nicola Frega

Department of Mechanical, Energy and Management Engineering, University of Calabria, Arcavacata di Rende (CS), 87036, Italy

Correspondence should be addressed to Alberto M. Felicetti; alberto.felicetti@unical.it

Received 3 August 2018; Accepted 30 October 2018; Published 2 December 2018

Guest Editor: Kashif Zia

Caption: Figure 1: A representation of the collective process for generating reliable food information.

Caption: Figure 2: A three-tier architecture for the CFA platform.

Caption: Figure 3: Systematic literature review workflow.

Table 1: The set T of technical approaches.

[t.sub.1]: near infrared spectroscopy
[t.sub.2]: mid infrared spectroscopy
[t.sub.3]: Raman spectroscopy
[t.sub.4]: fluorescence spectroscopy
[t.sub.5]: camera image sensors
[t.sub.6]: hyperspectral imaging
[t.sub.7]: gas gravimetric sensors
[t.sub.8]: gas biosensors
[t.sub.9]: gas electrochemical sensors
[t.sub.10]: gas optical sensors
[t.sub.11]: solids and liquids gravimetric sensors
[t.sub.12]: solids and liquids biosensors
[t.sub.13]: solids and liquids electrochemical sensors
[t.sub.14]: solids and liquids optical sensors

Table 2: The set C of food characteristics.

[c.sub.1] microbial properties: food kinetic properties that can be measured by microbial detection (e.g., the total count of microorganisms in a food sample)
[c.sub.2] chemical properties: food kinetic properties that can be chemically detected (e.g., pH value and total volatile basic nitrogen)
[c.sub.3] chemical compounds: properties of chemical compounds (e.g., concentration level)
[c.sub.4] surface conditions: visible attributes describing the physical outer aspect of a food item (such as colour and shape)
[c.sub.5] mass-volume related properties: physical properties of a food sample (e.g., weight and volume)
[c.sub.6] volatile organic compounds: organic compounds released as vapours or gases into the air by solid or liquid foods

Table 3: The set P of food performances.

[p.sub.1] freshness/spoilage: spoilage/edible level of a food product
[p.sub.2] hazard: degree of hazard, e.g., presence of illegal ingredients or treatments contaminating or poisoning a food product
[p.sub.3] ingredients: edible substances in a dish or a food product
[p.sub.4] category: food group (e.g., fruits, dairy, meat, fish) or food type (e.g., apple, orange, apricot) which a food sample belongs to
[p.sub.5] variety: variety of a food sample belonging to a food type (e.g., sunstar orange, belladonna orange, tarocco orange)
[p.sub.6] nutrients: nutrients (protein, carbs, fat, calories, vitamins, minerals) and their quantities in a food sample
[p.sub.7] taste perception: level of tastes (e.g., sourness, saltiness, umami, bitterness, and sweetness)
[p.sub.8] quality grade: quality assessment of a food sample according to some standardised grading system
[p.sub.9] geographical origin: geographical area where a food sample originated, according to some geographical classification
[p.sub.10] adulteration: presence and quantities of improper substances in a food product

Table 4: Technological research patterns and related works.

Technical Approach | Food Characteristics | Food Performances | Review Papers

[t.sub.1] | [c.sub.1], [c.sub.2], [c.sub.3] | [p.sub.1], [p.sub.2], [p.sub.6], [p.sub.8], [p.sub.10] | [r1] [r2] [r3] [r6] [r7] [r8] [r9] [r10] [r11] [r12] [r13] [r14] [r15] [r16] [r17] [r18] [r19] [r20] [r22] [r23] [r24] [r25] [r26] [r28] [r31] [r32] [r55] [r67]
[t.sub.2] | [c.sub.2], [c.sub.3] | [p.sub.1], [p.sub.2], [p.sub.3], [p.sub.4], [p.sub.8], [p.sub.9], [p.sub.10] | [r2] [r10] [r11] [r12] [r13] [r14] [r17] [r19] [r21] [r23] [r31] [r32]
[t.sub.3] | [c.sub.1], [c.sub.2], [c.sub.3] | [p.sub.1], [p.sub.3], [p.sub.8], [p.sub.10] | [r5] [r8] [r9] [r11] [r12] [r13] [r14] [r21] [r23] [r25] [r32]
[t.sub.4] | [c.sub.1], [c.sub.2] | [p.sub.1], [p.sub.10] | [r8] [r11] [r12]
[t.sub.5] | [c.sub.4], [c.sub.5] | [p.sub.1], [p.sub.5], [p.sub.6], [p.sub.8] | [r4] [r27] [r35] [r36] [r37] [r39] [r40] [r41] [r42] [r43] [r44] [r45] [r67]
[t.sub.6] | [c.sub.1], [c.sub.3], [c.sub.4], [c.sub.5] | [p.sub.1], [p.sub.5], [p.sub.6], [p.sub.8], [p.sub.10] | [r9] [r12] [r13] [r14] [r15] [r16] [r18] [r21] [r23] [r27] [r28] [r29] [r30] [r33] [r34] [r38]
[t.sub.7] | [c.sub.6] | [p.sub.1], [p.sub.10] | [r46]
[t.sub.8] | [c.sub.6] | [p.sub.1], [p.sub.10] | [r46]
[t.sub.9] | [c.sub.1], [c.sub.6] | [p.sub.1], [p.sub.4], [p.sub.8], [p.sub.9], [p.sub.10] | [r46] [r47] [r48] [r49] [r50] [r51] [r52] [r53] [r54] [r55] [r56]
[t.sub.10] | [c.sub.6] | [p.sub.1], [p.sub.10] | [r47] [r48] [r49]
[t.sub.11] | [c.sub.3] | [p.sub.1], [p.sub.7] | [r53] [r62] [r63] [r65]
[t.sub.12] | [c.sub.3] | [p.sub.1], [p.sub.5], [p.sub.7] | [r62] [r63] [r65]
[t.sub.13] | [c.sub.1], [c.sub.3] | [p.sub.1], [p.sub.2], [p.sub.4], [p.sub.7], [p.sub.9], [p.sub.10] | [r54] [r55] [r56] [r57] [r58] [r60] [r61] [r62] [r63] [r64] [r66]
[t.sub.14] | [c.sub.1], [c.sub.3] | [p.sub.1], [p.sub.7], [p.sub.9], [p.sub.10] | [r46] [r53] [r58] [r62] [r63] [r64] [r65]
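
For readers who want to reuse the mapping in Table 4 programmatically (for instance, in a CFA platform service that suggests which sensing techniques could verify a claimed food performance), the following minimal Python sketch shows one possible encoding. It reproduces only three rows of the table, and the names PATTERNS and techniques_for are illustrative assumptions, not artifacts of this study.

# Illustrative sketch (not part of the paper): a few rows of Table 4 encoded
# as a plain Python dictionary keyed by technical approach (Table 1 indices),
# with the food characteristics (Table 2) and performances (Table 3) it covers.
PATTERNS = {
    "t7": {"characteristics": {"c6"}, "performances": {"p1", "p10"}},                              # gas gravimetric sensors
    "t9": {"characteristics": {"c1", "c6"}, "performances": {"p1", "p4", "p8", "p9", "p10"}},      # gas electrochemical sensors
    "t13": {"characteristics": {"c1", "c3"}, "performances": {"p1", "p2", "p4", "p7", "p9", "p10"}},  # solids/liquids electrochemical sensors
}

def techniques_for(performance):
    """Return the technical approaches able to assess a given food performance."""
    return sorted(t for t, row in PATTERNS.items() if performance in row["performances"])

if __name__ == "__main__":
    # Example query: which of the encoded techniques can assess freshness/spoilage (p1)?
    print(techniques_for("p1"))  # -> ['t13', 't7', 't9']

In such an encoding, a platform component could intersect the techniques available on a consumer's device with those required for the performance of interest before prompting a measurement.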