n8n automation: anomaly detection on crops
This n8n workflow detects anomalies in an agricultural crops image dataset. As precision agriculture becomes essential, this kind of n8n automation lets farmers and researchers monitor crop health continuously. By combining an embedding API with a vector database, the workflow supports informed decisions that help optimize yields and reduce losses.
The workflow starts with an Execute Workflow Trigger that receives an image URL. Set nodes store that URL and the variables needed to reach the Qdrant collection (cloud URL, collection name, and which medoid and threshold fields to use). HTTP Request nodes then count the points in the collection and facet them by crop_name to determine how many crop classes exist, embed the input image with the Voyage AI multimodal model, and query Qdrant for the medoid of each crop class. Finally, a Python Code node compares the similarity scores against the per-crop thresholds: if the image reaches at least one threshold it is labeled with the most similar crop, otherwise it is flagged as a possible anomaly. Sticky Note nodes document each stage of the canvas. With this workflow in place, users gain better visibility into the state of their crops, reducing the risk of undetected anomalies and improving the profitability of their operations.
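For reference, the pinned test data in the workflow JSON below shows the input this sub-workflow expects and the shape of the message it returns; the snippet is only an illustration of that contract, not an additional node.

# Input item received by the Execute Workflow Trigger (see "pinData" in the JSON below)
input_item = {
    "query": {
        "imageURL": "https://storage.googleapis.com/n8n-qdrant-demo/agricultural-crops%2Fcotton%2Fimage%20(36).jpg"
    }
}

# Single output item produced by the "Compare scores" Code node, one of:
known_crop = {"result": "Looks similar to <crop_name>"}
anomaly = {"result": "ALERT, we might have a new undefined crop!"}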
n8n workflow for agriculture, data analysis, anomalies: overview
Diagram of the nodes and connections of this n8n workflow, generated from the n8n JSON.
n8n workflow for agriculture, data analysis, anomalies: node details
{
"id": "G8jRDBvwsMkkMiLN",
"meta": {
"instanceId": "205b3bc06c96f2dc835b4f00e1cbf9a937a74eeb3b47c99d0c30b0586dbf85aa"
},
"name": "[3/3] Anomaly detection tool (crops dataset)",
"tags": [
{
"id": "spMntyrlE9ydvWFA",
"name": "anomaly-detection",
"createdAt": "2024-12-08T22:05:15.945Z",
"updatedAt": "2024-12-09T12:50:19.287Z"
}
],
"nodes": [
{
"id": "e01bafec-eb24-44c7-b3c4-a60f91666350",
"name": "Sticky Note1",
"type": "n8n-nodes-base.stickyNote",
"position": [
-1200,
180
],
"parameters": {
"color": 6,
"width": 400,
"height": 740,
"content": "We are working here with crops dataset: \nExisting (so not anomalies) crops images in dataset are:\n- 'pearl_millet(bajra)',\n- 'tobacco-plant',\n- 'cherry',\n- 'cotton',\n- 'banana',\n- 'cucumber',\n- 'maize',\n- 'wheat',\n- 'clove',\n- 'jowar',\n- 'olive-tree',\n- 'soyabean',\n- 'coffee-plant',\n- 'rice',\n- 'lemon',\n- 'mustard-oil',\n- 'vigna-radiati(mung)',\n- 'coconut',\n- 'gram',\n- 'pineapple',\n- 'sugarcane',\n- 'sunflower',\n- 'chilli',\n- 'fox_nut(makhana)',\n- 'jute',\n- 'papaya',\n- 'tea',\n- 'cardamom',\n- 'almond'\n"
},
"typeVersion": 1
},
{
"id": "b9943781-de1f-4129-9b81-ed836e9ebb11",
"name": "Embed image",
"type": "n8n-nodes-base.httpRequest",
"position": [
680,
60
],
"parameters": {
"url": "https://api.voyageai.com/v1/multimodalembeddings",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"inputs\": [\n {\n \"content\": [\n {\n \"type\": \"image_url\",\n \"image_url\": $('Image URL hardcode').first().json.imageURL\n }\n ]\n }\n ],\n \"model\": \"voyage-multimodal-3\",\n \"input_type\": \"document\"\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "genericCredentialType",
"genericAuthType": "httpHeaderAuth"
},
"credentials": {
"httpHeaderAuth": {
"id": "Vb0RNVDnIHmgnZOP",
"name": "Voyage API"
}
},
"typeVersion": 4.2
},
{
"id": "47b72bc2-4817-48c6-b517-c1328e402468",
"name": "Get similarity of medoids",
"type": "n8n-nodes-base.httpRequest",
"position": [
940,
60
],
"parameters": {
"url": "={{ $('Variables for medoids').first().json.qdrantCloudURL }}/collections/{{ $('Variables for medoids').first().json.collectionName }}/points/query",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"query\": $json.data[0].embedding,\n \"using\": \"voyage\",\n \"limit\": $('Info About Crop Labeled Clusters').first().json.cropsNumber,\n \"with_payload\": true,\n \"filter\": {\n \"must\": [\n { \n \"key\": $('Variables for medoids').first().json.clusterCenterType,\n \"match\": {\n \"value\": true\n }\n }\n ]\n }\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "42d7eb27-ec38-4406-b5c4-27eb45358e93",
"name": "Compare scores",
"type": "n8n-nodes-base.code",
"position": [
1140,
60
],
"parameters": {
"language": "python",
"pythonCode": "points = _input.first()['json']['result']['points']\nthreshold_type = _('Variables for medoids').first()['json']['clusterThresholdCenterType']\n\nmax_score = -1\ncrop_with_max_score = None\nundefined = True\n\nfor center in points:\n if center['score'] >= center['payload'][threshold_type]:\n undefined = False\n if center['score'] > max_score:\n max_score = center['score']\n crop_with_max_score = center['payload']['crop_name']\n\nif undefined:\n result_message = \"ALERT, we might have a new undefined crop!\"\nelse:\n result_message = f\"Looks similar to {crop_with_max_score}\"\n\nreturn [{\n \"json\": {\n \"result\": result_message\n }\n}]\n"
},
"typeVersion": 2
},
{
"id": "23aa604a-ff0b-4948-bcd5-af39300198c0",
"name": "Sticky Note4",
"type": "n8n-nodes-base.stickyNote",
"position": [
-1200,
-220
],
"parameters": {
"width": 400,
"height": 380,
"content": "## Crop Anomaly Detection Tool\n### This is the tool that can be used directly for anomalous crops detection. \nIt takes as input (any) **image URL** and returns a **text message** telling if whatever this image depicts is anomalous to the crop dataset stored in Qdrant. \n\n* An Image URL is received via the Execute Workflow Trigger which is used to generate embedding vectors via the Voyage.ai Embeddings API.\n* The returned vectors are used to query the Qdrant collection to determine if the given crop is known by comparing it to **threshold scores** of each image class (crop type).\n* If the image scores lower than all thresholds, then the image is considered an anomaly for the dataset."
},
"typeVersion": 1
},
{
"id": "3a79eca2-44f9-4aee-8a0d-9c7ca2f9149d",
"name": "Variables for medoids",
"type": "n8n-nodes-base.set",
"position": [
-200,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "dbbc1e7b-c63e-4ff1-9524-8ef3e9f6cd48",
"name": "clusterCenterType",
"type": "string",
"value": "is_medoid"
},
{
"id": "a994ce37-2530-4030-acfb-ec777a7ddb05",
"name": "qdrantCloudURL",
"type": "string",
"value": "https://152bc6e2-832a-415c-a1aa-fb529f8baf8d.eu-central-1-0.aws.cloud.qdrant.io"
},
{
"id": "12f0a9e6-686d-416e-a61b-72d034ec21ba",
"name": "collectionName",
"type": "string",
"value": "=agricultural-crops"
},
{
"id": "4c88a617-d44f-4776-b457-8a1dffb1d03c",
"name": "clusterThresholdCenterType",
"type": "string",
"value": "is_medoid_cluster_threshold"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "13b25434-bd66-4293-93f1-26c67b9ec7dd",
"name": "Sticky Note3",
"type": "n8n-nodes-base.stickyNote",
"position": [
-340,
260
],
"parameters": {
"color": 6,
"width": 360,
"height": 200,
"content": "**clusterCenterType** - either\n* \"is_text_anchor_medoid\" or\n* \"is_medoid\"\n\n\n**clusterThresholdCenterType** - either\n* \"is_text_anchor_medoid_cluster_threshold\" or\n* \"is_medoid_cluster_threshold\""
},
"typeVersion": 1
},
{
"id": "869b0962-6cae-487d-8230-539a0cc4c14c",
"name": "Info About Crop Labeled Clusters",
"type": "n8n-nodes-base.set",
"position": [
440,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "5327b254-b703-4a34-a398-f82edb1d6d6b",
"name": "=cropsNumber",
"type": "number",
"value": "={{ $json.result.hits.length }}"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "5d3956f8-f43b-439e-b176-a594a21d8011",
"name": "Total Points in Collection",
"type": "n8n-nodes-base.httpRequest",
"position": [
40,
60
],
"parameters": {
"url": "={{ $json.qdrantCloudURL }}/collections/{{ $json.collectionName }}/points/count",
"method": "POST",
"options": {},
"jsonBody": "={\n \"exact\": true\n}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "14ba3db9-3965-4b20-b333-145616d45c3a",
"name": "Each Crop Counts",
"type": "n8n-nodes-base.httpRequest",
"position": [
240,
60
],
"parameters": {
"url": "={{ $('Variables for medoids').first().json.qdrantCloudURL }}/collections/{{ $('Variables for medoids').first().json.collectionName }}/facet",
"method": "POST",
"options": {},
"jsonBody": "={{\n{\n \"key\": \"crop_name\",\n \"limit\": $json.result.count,\n \"exact\": true\n}\n}}",
"sendBody": true,
"specifyBody": "json",
"authentication": "predefinedCredentialType",
"nodeCredentialType": "qdrantApi"
},
"credentials": {
"qdrantApi": {
"id": "it3j3hP9FICqhgX6",
"name": "QdrantApi account"
}
},
"typeVersion": 4.2
},
{
"id": "e37c6758-0556-4a56-ab14-d4df663cb53a",
"name": "Image URL hardcode",
"type": "n8n-nodes-base.set",
"position": [
-480,
60
],
"parameters": {
"options": {},
"assignments": {
"assignments": [
{
"id": "46ceba40-fb25-450c-8550-d43d8b8aa94c",
"name": "imageURL",
"type": "string",
"value": "={{ $json.query.imageURL }}"
}
]
}
},
"typeVersion": 3.4
},
{
"id": "b24ad1a7-0cf8-4acc-9c18-6fe9d58b10f2",
"name": "Execute Workflow Trigger",
"type": "n8n-nodes-base.executeWorkflowTrigger",
"position": [
-720,
60
],
"parameters": {},
"typeVersion": 1
},
{
"id": "50424f2b-6831-41bf-8de4-81f69d901ce1",
"name": "Sticky Note2",
"type": "n8n-nodes-base.stickyNote",
"position": [
-240,
-80
],
"parameters": {
"width": 180,
"height": 120,
"content": "Variables to access Qdrant's collection we uploaded & prepared for anomaly detection in 2 previous pipelines\n"
},
"typeVersion": 1
},
{
"id": "2e8ed3ca-1bba-4214-b34b-376a237842ff",
"name": "Sticky Note5",
"type": "n8n-nodes-base.stickyNote",
"position": [
40,
-120
],
"parameters": {
"width": 560,
"height": 140,
"content": "These three nodes are needed just to figure out how many different classes (crops) we have in our Qdrant collection: **cropsNumber** (needed in *\"Get similarity of medoids\"* node. \n[Note] *\"Total Points in Collection\"* -> *\"Each Crop Counts\"* were used&explained already in *\"[2/4] Set up medoids (2 types) for anomaly detection (crops dataset)\"* pipeline.\n"
},
"typeVersion": 1
},
{
"id": "e2fa5763-6e97-4ff5-8919-1cb85a3c6968",
"name": "Sticky Note6",
"type": "n8n-nodes-base.stickyNote",
"position": [
620,
240
],
"parameters": {
"height": 120,
"content": "Here, we're embedding the image passed to this workflow tool with the Voyage embedding model to compare the image to all crop images in the database."
},
"typeVersion": 1
},
{
"id": "cdb6b8d3-f7f4-4d66-850f-ce16c8ed98b9",
"name": "Sticky Note7",
"type": "n8n-nodes-base.stickyNote",
"position": [
920,
220
],
"parameters": {
"width": 400,
"height": 180,
"content": "Checking how similar the image is to all the centres of clusters (crops).\nIf it's more similar to the thresholds we set up and stored in centres in the previous workflow, the image probably belongs to this crop class; otherwise, it's anomalous to the class. \nIf image is anomalous to all the classes, it's an anomaly."
},
"typeVersion": 1
},
{
"id": "03b4699f-ba43-4f5f-ad69-6f81deea2641",
"name": "Sticky Note22",
"type": "n8n-nodes-base.stickyNote",
"position": [
-620,
580
],
"parameters": {
"color": 4,
"width": 540,
"height": 300,
"content": "### For anomaly detection\n1. The first pipeline is uploading (crops) dataset to Qdrant's collection.\n2. The second pipeline sets up cluster (class) centres in this Qdrant collection & cluster (class) threshold scores.\n3. **This is the anomaly detection tool, which takes any image as input and uses all preparatory work done with Qdrant (crops) collection.**\n\n### To recreate it\nYou'll have to upload [crops](https://www.kaggle.com/datasets/mdwaquarazam/agricultural-crops-image-classification) dataset from Kaggle to your own Google Storage bucket, and re-create APIs/connections to [Qdrant Cloud](https://qdrant.tech/documentation/quickstart-cloud/) (you can use **Free Tier** cluster), Voyage AI API & Google Cloud Storage\n\n**In general, pipelines are adaptable to any dataset of images**\n"
},
"typeVersion": 1
}
],
"active": false,
"pinData": {
"Execute Workflow Trigger": [
{
"json": {
"query": {
"imageURL": "https://storage.googleapis.com/n8n-qdrant-demo/agricultural-crops%2Fcotton%2Fimage%20(36).jpg"
}
}
}
]
},
"settings": {
"executionOrder": "v1"
},
"versionId": "f67b764b-9e1a-4db0-b9f2-490077a62f74",
"connections": {
"Embed image": {
"main": [
[
{
"node": "Get similarity of medoids",
"type": "main",
"index": 0
}
]
]
},
"Each Crop Counts": {
"main": [
[
{
"node": "Info About Crop Labeled Clusters",
"type": "main",
"index": 0
}
]
]
},
"Image URL hardcode": {
"main": [
[
{
"node": "Variables for medoids",
"type": "main",
"index": 0
}
]
]
},
"Variables for medoids": {
"main": [
[
{
"node": "Total Points in Collection",
"type": "main",
"index": 0
}
]
]
},
"Execute Workflow Trigger": {
"main": [
[
{
"node": "Image URL hardcode",
"type": "main",
"index": 0
}
]
]
},
"Get similarity of medoids": {
"main": [
[
{
"node": "Compare scores",
"type": "main",
"index": 0
}
]
]
},
"Total Points in Collection": {
"main": [
[
{
"node": "Each Crop Counts",
"type": "main",
"index": 0
}
]
]
},
"Info About Crop Labeled Clusters": {
"main": [
[
{
"node": "Embed image",
"type": "main",
"index": 0
}
]
]
}
}
}
n8n workflow for agriculture, data analysis, anomalies: who is this workflow for?
This workflow is aimed primarily at farmers, agricultural researchers, and data scientists working in the agri-food sector. It is designed for users with an intermediate technical level who want to automate the analysis of crop image data.
n8n workflow for agriculture, data analysis, anomalies: problem solved
This workflow addresses the problem of late detection of anomalies in crops, which can lead to significant losses. By automating the data-analysis step, it lets users react quickly to issues, optimize yields, and limit risks to crop health. Crops can be monitored continuously, which improves decision-making.
n8n workflow for agriculture, data analysis, anomalies: workflow steps
- Step 1: The Execute Workflow Trigger receives an image URL from the calling workflow.
- Step 2: A Set node ("Image URL hardcode") stores the incoming imageURL.
- Step 3: A Set node ("Variables for medoids") defines the Qdrant Cloud URL, the collection name, and which medoid and threshold fields to use.
- Step 4: An HTTP request ("Total Points in Collection") counts all points in the Qdrant collection.
- Step 5: An HTTP request ("Each Crop Counts") facets the collection by crop_name.
- Step 6: A Set node ("Info About Crop Labeled Clusters") derives cropsNumber, the number of crop classes.
- Step 7: An HTTP request ("Embed image") embeds the input image with the Voyage AI voyage-multimodal-3 model.
- Step 8: An HTTP request ("Get similarity of medoids") queries Qdrant with the resulting embedding, restricted to the medoid point of each crop class.
- Step 9: A Python Code node ("Compare scores") checks each similarity score against the per-crop threshold and returns either the most similar crop or an anomaly alert (see the annotated code after this list).
Sticky Note nodes placed around these steps document the dataset, the variables, and the detection logic; they do not affect execution.
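The decision logic lives in the "Compare scores" Code node. Below is that Python reproduced from the workflow JSON in readable form, with comments added; n8n's Python Code node exposes _input and _('Node name') for reading item data.

# Logic of the "Compare scores" node, as stored in the workflow JSON above (comments added).
points = _input.first()['json']['result']['points']
threshold_type = _('Variables for medoids').first()['json']['clusterThresholdCenterType']

max_score = -1
crop_with_max_score = None
undefined = True  # stays True if no per-crop threshold is reached -> anomaly

for center in points:
    # center['score'] is the similarity Qdrant returned between the query image and this
    # crop's medoid; the per-crop threshold was stored on the medoid point in the previous pipeline.
    if center['score'] >= center['payload'][threshold_type]:
        undefined = False
        if center['score'] > max_score:
            max_score = center['score']
            crop_with_max_score = center['payload']['crop_name']

if undefined:
    result_message = "ALERT, we might have a new undefined crop!"
else:
    result_message = f"Looks similar to {crop_with_max_score}"

return [{"json": {"result": result_message}}]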
n8n workflow for agriculture, data analysis, anomalies: customization guide
To customize this workflow, edit the Qdrant Cloud URL and collection name in the "Variables for medoids" node, and point the HTTP Request nodes' credentials (Qdrant and Voyage AI) at your own accounts so the queries match your own crop data. The sticky note next to that node lists the two supported variants: clusterCenterType can be "is_medoid" or "is_text_anchor_medoid", and clusterThresholdCenterType can be "is_medoid_cluster_threshold" or "is_text_anchor_medoid_cluster_threshold". You can also adjust the Python in the "Compare scores" node to tune how strictly matches are accepted, and add or remove sticky notes to keep your annotations organized.
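As one hedged example of such a tweak, the sketch below adds a hypothetical safety margin on top of each stored threshold, so an image must clearly exceed a crop's threshold before being labeled as that crop. The MARGIN constant and this stricter rule are illustrative assumptions, not part of the original workflow.

# Hypothetical variant of the "Compare scores" node: require the similarity to beat the
# stored per-crop threshold by a safety margin before accepting the match.
# MARGIN is an illustrative assumption; tune it (or set it to 0) for your dataset.
MARGIN = 0.02

points = _input.first()['json']['result']['points']
threshold_type = _('Variables for medoids').first()['json']['clusterThresholdCenterType']

best = None
for center in points:
    if center['score'] >= center['payload'][threshold_type] + MARGIN:
        if best is None or center['score'] > best['score']:
            best = center

if best is None:
    result_message = "ALERT, we might have a new undefined crop!"
else:
    result_message = f"Looks similar to {best['payload']['crop_name']}"

return [{"json": {"result": result_message}}]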