n8n automation: anomaly detection in crops
This n8n workflow automates anomaly detection on a crops image dataset, giving farmers and researchers an efficient way to monitor crop health. By processing the image data and scoring it against the known crop classes, it identifies anomalies quickly, which matters for crop management and decision-making. The process starts with a trigger that runs the workflow, followed by several key steps. First, the input image is embedded via an HTTP request so its visual content can be analyzed. The workflow then uses further HTTP requests to obtain the similarity of the image to the cluster medoids, so the scores for the different crops can be compared. Sticky notes document each stage, giving a clear view of what is happening on the canvas. The end result is a significant reduction in the time needed to detect potential problems in crops, letting users react quickly and optimize yields. With this n8n automation, agricultural businesses can improve operational efficiency and reduce crop-health risks.
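As a reading aid before the workflow JSON below, here is a minimal Python sketch of the two HTTP calls the tool makes, recreated outside n8n: embedding the image URL with the Voyage multimodal embeddings endpoint, then querying the Qdrant collection for the similarity to each crop medoid. The endpoint URLs, model name, payload fields (is_medoid, crop_name) and response shapes are taken from the workflow itself; the API keys, cluster URL, auth header formats and crop count are placeholders and assumptions you would replace with your own values.

import requests

# Placeholders - replace with your own credentials and Qdrant Cloud cluster (assumptions).
VOYAGE_API_KEY = "YOUR_VOYAGE_API_KEY"
QDRANT_URL = "https://YOUR-CLUSTER.cloud.qdrant.io"
QDRANT_API_KEY = "YOUR_QDRANT_API_KEY"
COLLECTION = "agricultural-crops"
CROPS_NUMBER = 30  # number of crop classes / medoids to retrieve

def embed_image(image_url):
    # Same request as the "Embed image" node: Voyage multimodal embeddings.
    resp = requests.post(
        "https://api.voyageai.com/v1/multimodalembeddings",
        headers={"Authorization": f"Bearer {VOYAGE_API_KEY}"},
        json={
            "inputs": [{"content": [{"type": "image_url", "image_url": image_url}]}],
            "model": "voyage-multimodal-3",
            "input_type": "document",
        },
    )
    resp.raise_for_status()
    return resp.json()["data"][0]["embedding"]

def query_medoids(embedding):
    # Same request as the "Get similarity of medoids" node: score the embedding
    # against every point flagged as a cluster medoid.
    resp = requests.post(
        f"{QDRANT_URL}/collections/{COLLECTION}/points/query",
        headers={"api-key": QDRANT_API_KEY},
        json={
            "query": embedding,
            "using": "voyage",
            "limit": CROPS_NUMBER,
            "with_payload": True,
            "filter": {"must": [{"key": "is_medoid", "match": {"value": True}}]},
        },
    )
    resp.raise_for_status()
    return resp.json()["result"]["points"]

Each returned point carries a similarity score and a payload with crop_name and the stored per-cluster threshold, which is exactly what the "Compare scores" node consumes.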
n8n workflow for agriculture, anomaly detection, data analysis: overview
Diagram of this n8n workflow's nodes and connections, generated from the n8n JSON.
n8n workflow for agriculture, anomaly detection, data analysis: node details
{
"id": "G8jRDBvwsMkkMiLN",
"meta": {
"instanceId": "205b3bc06c96f2dc835b4f00e1cbf9a937a74eeb3b47c99d0c30b0586dbf85aa"
},
"name": "[3/3] Anomaly detection tool (crops dataset)",
"tags": [
{
"id": "spMntyrlE9ydvWFA",
"name": "anomaly-detection",
"createdAt": "2024-12-08T22:05:15.945Z",
"updatedAt": "2024-12-09T12:50:19.287Z"
}
],
"nodes": [
{
"id": "e01bafec-eb24-44c7-b3c4-a60f91666350",
"name": "Sticky Note1",
"type": "n8n-nodes-base.stickyNote",
"position": [
-1200,
180
],
"parameters": {
"color": 6,
"width": 400,
"height": 740,
"content": "We are working here with crops dataset: \nExisting (so not anomalies) crops images in dataset are:\n- 'pearl_millet(bajra)',\n- 'tobacco-plant',\n- 'cherry',\n- 'cotton',\n- 'banana',\n- 'cucumber',\n- 'maize',\n- 'wheat',\n- 'clove',\n- 'jowar',\n- 'olive-tree',\n- 'soyabean',\n- 'coffee-plant',\n- 'rice',\n- 'lemon',\n- 'mustard-oil',\n- 'vigna-radiati(mung)',\n- 'coconut',\n- 'gram',\n- 'pineapple',\n- 'sugarcane',\n- 'sunflower',\n- 'chilli',\n- 'fox_nut(makhana)',\n- 'jute',\n- 'papaya',\n- 'tea',\n- 'cardamom',\n- 'almond'\n"
},
"typeVersion": 1
},
{
"id": "b9943781-de1f-4129-9b81-ed836e9ebb11",
"name": "Embed image",
"type": "n8n-nodes-base.httpRequest",
"position": [
680,
60
],
"parameters": {
"url": "https://api.voyageai.com/v1/multimodalembeddings",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"inputs\": [\n {\n \"content\": [\n {\n \"type\": \"image_url\",\n \"image_url\": $('Image URL hardcode').first().json.imageURL\n }\n ]\n }\n ],\n \"model\": \"voyage-multimodal-3\",\n \"input_type\": \"document\"\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "genericCredentialType",
"genericAuthType": "httpHeaderAuth"
},
"credentials": {
"httpHeaderAuth": {
"id": "Vb0RNVDnIHmgnZOP",
"name": "Voyage API"
}
},
"typeVersion": 4.2
},
{
"id": "47b72bc2-4817-48c6-b517-c1328e402468",
"name": "Get similarity of medoids",
"type": "n8n-nodes-base.httpRequest",
"position": [
940,
60
],
"parameters": {
"url": "={{ $('Variables for medoids').first().json.qdrantCloudURL }}/collections/{{ $('Variables for medoids').first().json.collectionName }}/points/query",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"query\": $json.data[0].embedding,\n \"using\": \"voyage\",\n \"limit\": $('Info About Crop Labeled Clusters').first().json.cropsNumber,\n \"with_payload\": true,\n \"filter\": {\n \"must\": [\n { \n \"key\": $('Variables for medoids').first().json.clusterCenterType,\n \"match\": {\n \"value\": true\n }\n }\n ]\n }\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "42d7eb27-ec38-4406-b5c4-27eb45358e93",
"name": "Compare scores",
"type": "n8n-nodes-base.code",
"position": [
1140,
60
],
"parameters": {
"language": "python",
"pythonCode": "points = _input.first()['json']['result']['points']\nthreshold_type = _('Variables for medoids').first()['json']['clusterThresholdCenterType']\n\nmax_score = -1\ncrop_with_max_score = None\nundefined = True\n\nfor center in points:\n if center['score'] >= center['payload'][threshold_type]:\n undefined = False\n if center['score'] > max_score:\n max_score = center['score']\n crop_with_max_score = center['payload']['crop_name']\n\nif undefined:\n result_message = \"ALERT, we might have a new undefined crop!\"\nelse:\n result_message = f\"Looks similar to {crop_with_max_score}\"\n\nreturn [{\n \"json\": {\n \"result\": result_message\n }\n}]\n"
},
"typeVersion": 2
},
{
"id": "23aa604a-ff0b-4948-bcd5-af39300198c0",
"name": "Sticky Note4",
"type": "n8n-nodes-base.stickyNote",
"position": [
-1200,
-220
],
"parameters": {
"width": 400,
"height": 380,
"content": "## Crop Anomaly Detection Tool\n### This is the tool that can be used directly for anomalous crops detection. \nIt takes as input (any) **image URL** and returns a **text message** telling if whatever this image depicts is anomalous to the crop dataset stored in Qdrant. \n\n* An Image URL is received via the Execute Workflow Trigger which is used to generate embedding vectors via the Voyage.ai Embeddings API.\n* The returned vectors are used to query the Qdrant collection to determine if the given crop is known by comparing it to **threshold scores** of each image class (crop type).\n* If the image scores lower than all thresholds, then the image is considered an anomaly for the dataset."
},
"typeVersion": 1
},
{
"id": "3a79eca2-44f9-4aee-8a0d-9c7ca2f9149d",
"name": "Variables for medoids",
"type": "n8n-nodes-base.set",
"position": [
-200,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "dbbc1e7b-c63e-4ff1-9524-8ef3e9f6cd48",
"name": "clusterCenterType",
"type": "string",
"value": "is_medoid"
},
{
"id": "a994ce37-2530-4030-acfb-ec777a7ddb05",
"name": "qdrantCloudURL",
"type": "string",
"value": "https://152bc6e2-832a-415c-a1aa-fb529f8baf8d.eu-central-1-0.aws.cloud.qdrant.io"
},
{
"id": "12f0a9e6-686d-416e-a61b-72d034ec21ba",
"name": "collectionName",
"type": "string",
"value": "=agricultural-crops"
},
{
"id": "4c88a617-d44f-4776-b457-8a1dffb1d03c",
"name": "clusterThresholdCenterType",
"type": "string",
"value": "is_medoid_cluster_threshold"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "13b25434-bd66-4293-93f1-26c67b9ec7dd",
"name": "Sticky Note3",
"type": "n8n-nodes-base.stickyNote",
"position": [
-340,
260
],
"parameters": {
"color": 6,
"width": 360,
"height": 200,
"content": "**clusterCenterType** - either\n* \"is_text_anchor_medoid\" or\n* \"is_medoid\"\n\n\n**clusterThresholdCenterType** - either\n* \"is_text_anchor_medoid_cluster_threshold\" or\n* \"is_medoid_cluster_threshold\""
},
"typeVersion": 1
},
{
"id": "869b0962-6cae-487d-8230-539a0cc4c14c",
"name": "Info About Crop Labeled Clusters",
"type": "n8n-nodes-base.set",
"position": [
440,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "5327b254-b703-4a34-a398-f82edb1d6d6b",
"name": "=cropsNumber",
"type": "number",
"value": "={{ $json.result.hits.length }}"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "5d3956f8-f43b-439e-b176-a594a21d8011",
"name": "Total Points in Collection",
"type": "n8n-nodes-base.httpRequest",
"position": [
40,
60
],
"parameters": {
"url": "={{ $json.qdrantCloudURL }}/collections/{{ $json.collectionName }}/points/count",
"method": "POST",
"options": {},
"jsonBody": "={\n \"exact\": true\n}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "14ba3db9-3965-4b20-b333-145616d45c3a",
"name": "Each Crop Counts",
"type": "n8n-nodes-base.httpRequest",
"position": [
240,
60
],
"parameters": {
"url": "={{ $('Variables for medoids').first().json.qdrantCloudURL }}/collections/{{ $('Variables for medoids').first().json.collectionName }}/facet",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"key\": \"crop_name\",\n \"limit\": $json.result.count,\n \"exact\": true\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "e37c6758-0556-4a56-ab14-d4df663cb53a",
"name": "Image URL hardcode",
"type": "n8n-nodes-base.set",
"position": [
-480,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "46ceba40-fb25-450c-8550-d43d8b8aa94c",
"name": "imageURL",
"type": "string",
"value": "={{ $json.query.imageURL }}"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "b24ad1a7-0cf8-4acc-9c18-6fe9d58b10f2",
"name": "Execute Workflow Trigger",
"type": "n8n-nodes-base.executeWorkflowTrigger",
"position": [
-720,
60
],
"parameters": {},
"typeVersion": 1
},
{
"id": "50424f2b-6831-41bf-8de4-81f69d901ce1",
"name": "Sticky Note2",
"type": "n8n-nodes-base.stickyNote",
"position": [
-240,
-80
],
"parameters": {
"width": 180,
"height": 120,
"content": "Variables to access Qdrant's collection we uploaded & prepared for anomaly detection in 2 previous pipelines\n"
},
"typeVersion": 1
},
{
"id": "2e8ed3ca-1bba-4214-b34b-376a237842ff",
"name": "Sticky Note5",
"type": "n8n-nodes-base.stickyNote",
"position": [
40,
-120
],
"parameters": {
"width": 560,
"height": 140,
"content": "These three nodes are needed just to figure out how many different classes (crops) we have in our Qdrant collection: **cropsNumber** (needed in *\"Get similarity of medoids\"* node. \n[Note] *\"Total Points in Collection\"* -> *\"Each Crop Counts\"* were used&explained already in *\"[2/4] Set up medoids (2 types) for anomaly detection (crops dataset)\"* pipeline.\n"
},
"typeVersion": 1
},
{
"id": "e2fa5763-6e97-4ff5-8919-1cb85a3c6968",
"name": "Sticky Note6",
"type": "n8n-nodes-base.stickyNote",
"position": [
620,
240
],
"parameters": {
"height": 120,
"content": "Here, we're embedding the image passed to this workflow tool with the Voyage embedding model to compare the image to all crop images in the database."
},
"typeVersion": 1
},
{
"id": "cdb6b8d3-f7f4-4d66-850f-ce16c8ed98b9",
"name": "Sticky Note7",
"type": "n8n-nodes-base.stickyNote",
"position": [
920,
220
],
"parameters": {
"width": 400,
"height": 180,
"content": "Checking how similar the image is to all the centres of clusters (crops).\nIf it's more similar to the thresholds we set up and stored in centres in the previous workflow, the image probably belongs to this crop class; otherwise, it's anomalous to the class. \nIf image is anomalous to all the classes, it's an anomaly."
},
"typeVersion": 1
},
{
"id": "03b4699f-ba43-4f5f-ad69-6f81deea2641",
"name": "Sticky Note22",
"type": "n8n-nodes-base.stickyNote",
"position": [
-620,
580
],
"parameters": {
"color": 4,
"width": 540,
"height": 300,
"content": "### For anomaly detection\n1. The first pipeline is uploading (crops) dataset to Qdrant's collection.\n2. The second pipeline sets up cluster (class) centres in this Qdrant collection & cluster (class) threshold scores.\n3. **This is the anomaly detection tool, which takes any image as input and uses all preparatory work done with Qdrant (crops) collection.**\n\n### To recreate it\nYou'll have to upload [crops](https://www.kaggle.com/datasets/mdwaquarazam/agricultural-crops-image-classification) dataset from Kaggle to your own Google Storage bucket, and re-create APIs/connections to [Qdrant Cloud](https://qdrant.tech/documentation/quickstart-cloud/) (you can use **Free Tier** cluster), Voyage AI API & Google Cloud Storage\n\n**In general, pipelines are adaptable to any dataset of images**\n"
},
"typeVersion": 1
}
],
"active": false,
"pinData": {
"Execute Workflow Trigger": [
{
"json": {
"query": {
"imageURL": "https://storage.googleapis.com/n8n-qdrant-demo/agricultural-crops%2Fcotton%2Fimage%20(36).jpg"
}
}
}
]
},
"settings": {
"executionOrder": "v1"
},
"versionId": "f67b764b-9e1a-4db0-b9f2-490077a62f74",
"connections": {
"Embed image": {
"main": [
[
{
"node": "Get similarity of medoids",
"type": "main",
"index": 0
}
]
]
},
"Each Crop Counts": {
"main": [
[
{
"node": "Info About Crop Labeled Clusters",
"type": "main",
"index": 0
}
]
]
},
"Image URL hardcode": {
"main": [
[
{
"node": "Variables for medoids",
"type": "main",
"index": 0
}
]
]
},
"Variables for medoids": {
"main": [
[
{
"node": "Total Points in Collection",
"type": "main",
"index": 0
}
]
]
},
"Execute Workflow Trigger": {
"main": [
[
{
"node": "Image URL hardcode",
"type": "main",
"index": 0
}
]
]
},
"Get similarity of medoids": {
"main": [
[
{
"node": "Compare scores",
"type": "main",
"index": 0
}
]
]
},
"Total Points in Collection": {
"main": [
[
{
"node": "Each Crop Counts",
"type": "main",
"index": 0
}
]
]
},
"Info About Crop Labeled Clusters": {
"main": [
[
{
"node": "Embed image",
"type": "main",
"index": 0
}
]
]
}
}
}
n8n workflow for agriculture, anomaly detection, data analysis: who is this workflow for?
This workflow is aimed primarily at farmers, researchers, and agricultural businesses looking to optimize crop management. It also suits technical teams with an intermediate grasp of automation and data-analysis tools.
n8n workflow for agriculture, anomaly detection, data analysis: problem solved
This workflow addresses the problem of late anomaly detection in crops, which can lead to significant yield losses. By automating the data analysis, it removes the frustration of manual monitoring and reduces the risk of crop damage. Users get concrete results faster, so they can make informed decisions and act proactively.
n8n workflow for agriculture, anomaly detection, data analysis: workflow steps
Step 1: The workflow is started by the Execute Workflow Trigger, which receives the image URL of the crop to check.
- Step 2: The total point count and the per-crop counts in the Qdrant collection are fetched to determine how many crop classes (cropsNumber) exist.
- Step 3: The crop image is embedded via an HTTP request to the Voyage multimodal embeddings API.
- Step 4: The similarity of the image embedding to each cluster medoid is obtained with another HTTP request to Qdrant.
- Step 5: The scores for the different crops are compared against their cluster thresholds in a Python Code node; see the sketch after this list.
- Step 6: Sticky notes document each stage, giving a clear view of how the tool works.
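For reference, the decision logic inside the "Compare scores" Code node is short enough to reproduce in plain Python; this mirrors the pythonCode string in the JSON above, with the n8n input helpers replaced by ordinary function arguments:

def compare_scores(points, threshold_type="is_medoid_cluster_threshold"):
    # points: result of the Qdrant medoid query; each entry has a similarity
    # "score" and a "payload" with "crop_name" and the stored cluster threshold.
    max_score = -1
    crop_with_max_score = None
    undefined = True

    for center in points:
        # The image is accepted by a cluster only if it scores above that cluster's threshold.
        if center["score"] >= center["payload"][threshold_type]:
            undefined = False
            if center["score"] > max_score:
                max_score = center["score"]
                crop_with_max_score = center["payload"]["crop_name"]

    if undefined:
        return "ALERT, we might have a new undefined crop!"
    return f"Looks similar to {crop_with_max_score}"

If no medoid's threshold is exceeded, the image is considered anomalous to the whole dataset; otherwise it is attributed to the best-scoring crop.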
n8n workflow for agriculture, anomaly detection, data analysis: customization guide
To customize this workflow, point it at your own data: pass your own crop image URLs as input, and update the Qdrant cluster URL and collection name in the "Variables for medoids" node. You can also adjust the request parameters to refine the analysis, for example switching between the medoid and text-anchor-medoid cluster centres described in the sticky notes; a small sketch of the two valid combinations follows. Adapt the sticky-note content so it reflects your own results, and add further analysis or visualization tools to extend the workflow to your specific needs.
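As an illustration of that parameter switch, a hypothetical snippet showing the two combinations the "Variables for medoids" node accepts, assuming the payload field names stored by the previous pipeline:

# Default configuration used by this workflow: plain (image) medoids.
medoid_variables = {
    "clusterCenterType": "is_medoid",
    "clusterThresholdCenterType": "is_medoid_cluster_threshold",
}

# Alternative: text-anchor medoids prepared in the previous pipeline.
text_anchor_variables = {
    "clusterCenterType": "is_text_anchor_medoid",
    "clusterThresholdCenterType": "is_text_anchor_medoid_cluster_threshold",
}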