{"id":4025,"date":"2026-01-28T07:00:00","date_gmt":"2026-01-27T22:00:00","guid":{"rendered":"https:\/\/www.litcoder.com\/?p=4025"},"modified":"2026-01-27T21:34:57","modified_gmt":"2026-01-27T12:34:57","slug":"openvino-object-detection-model%ec%9d%98-%ec%a0%84%ec%b2%98%eb%a6%ac%ec%99%80-%ed%9b%84%ec%b2%98%eb%a6%ac%eb%a5%bc-%ea%b0%84%eb%8b%a8%ed%95%98%ea%b2%8c","status":"publish","type":"post","link":"https:\/\/litcoder.com\/?p=4025","title":{"rendered":"OpenVINO Object Detection Model\uc758 \uc804\ucc98\ub9ac\uc640 \ud6c4\ucc98\ub9ac\ub97c \uac04\ub2e8\ud558\uac8c"},"content":{"rendered":"\n<p>AI model\ub4e4\uc740 \uc785\ucd9c\ub825 layer\uc758 \uad6c\uc870\uac00 \ub2e4\ub974\ubbc0\ub85c \uc11c\ub85c \ub2e4\ub978 pre\/post processing\uc744 \ud544\uc694\ub85c \ud55c\ub2e4. Vision model\uc744 \uc608\ub85c \ub4e4\uba74 \uc5b4\ub5a4 \ubaa8\ub378\uc740 \uc785\ub825\uc744 416&#215;416\uc73c\ub85c \ubc1b\uac8c \ub418\uc5b4 \uc788\uc5b4\uc11c \uc785\ub825 \uc804\uc5d0 \uc6d0\ubcf8 \ud06c\uae30\ub85c \ubd80\ud130 resizing\uc744 \ud574\uc8fc\uc5b4\uc57c \ud558\uace0, \uc5b4\ub5a4 \ubaa8\ub378\uc740 resizing layer\uc744 \ud3ec\ud568\ud558\uace0 \uc788\uc5b4\uc11c \uadf8\ub0e5 \uc785\ub825\ud574\ub3c4 \uc798 \ub3d9\uc791\ud558\uae30\ub3c4 \ud55c\ub2e4. \ub610\ud55c \uc5b4\ub5a4 \ubaa8\ub378\uc740 \uc785\ub825\uac12\uc744 \uc815\uaddc\ud654 \ud574\uc11c \uc785\ub825\ud574 \uc8fc\uc5b4\uc57c \ud558\uace0, \uc5b4\ub5a4 \ubaa8\ub378\uc740 \uc815\uaddc\ud654\uac00 \ud544\uc694 \uc5c6\uae30\ub3c4 \ud558\ub2e4.<\/p>\n\n\n\n<p>\uc785\ub825\uce35 \ubfd0 \uc544\ub2c8\ub77c \ucd9c\ub825\uce35\ub3c4 \uc81c\uac01\uac01\uc774\ub2e4. 
<p>Looking at object-detection models, for example, YOLO-style models emit the confidence score together with each bounding box, whereas models such as D-Fine output the bounding boxes, scores, and labels as separate tensors.</p>

<p>To see how a particular model's inputs and outputs are laid out, inspect the contents of the loaded model's inputs and outputs attributes.</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">def print_model_io_layer(ovcore: Core, model_path: str, model_name: str):
    read_model = ovcore.read_model(model_path)
    print(f"[{model_name}]")

    # Display input layers
    for idx, input_layer in enumerate(read_model.inputs):
        print(f"Input({idx+1}):")
        print("  any_name     :", input_layer.get_any_name())
        print("  names        :", input_layer.get_names())
        print("  shape        :", input_layer.get_partial_shape())
        print("  element type :", input_layer.get_element_type())

    # Display output layers
    for idx, output_layer in enumerate(read_model.outputs):
        print(f"Output({idx+1}):")
        print("  any_name     :", output_layer.get_any_name())
        print("  names        :", output_layer.get_names())
        print("  shape        :", output_layer.get_partial_shape())
        print("  element type :", output_layer.get_element_type())</pre>

<p>Handling the inputs and outputs of this many different models by hand, however, is quite a chore.</p>
<p>OpenVINO does provide the <a href="https://docs.openvino.ai/2025/api/ie_python_api/_autosummary/openvino.preprocess.PrePostProcessor.html">PrePostProcessor</a>, which lets you attach your own pre- and post-processing to a model, but even with it the model-specific parts still have to be written by hand, so it is just as tedious.</p>

<h2 class="wp-block-heading">Using the OpenVINO Model API</h2>

<p>Suppose, for example, you have four OpenVINO models (ATSS, YOLOX-S, YOLOX-Tiny, and D-Fine) plus an ONNX version of each, eight different models in all. Writing handling code for eight different input and output formats would take a great deal of effort.</p>

<p><a href="https://pypi.org/project/openvino-model-api/" data-type="link" data-id="https://pypi.org/project/openvino-model-api/">openvino-model-api</a> makes this relatively simple to handle. 
To use it, install the required packages:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="generic" data-enlighter-theme="dracula" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">pip install openvino openvino-model-api onnx</pre>

<p>Once installation is done, first create a model object that irons out the differing input and output layers. Build an OpenvinoAdapter from the model file you want to run inference with and hand it to create_model(). A model created this way performs the input pre-processing automatically.</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from openvino import Core
from model_api.adapters import OpenvinoAdapter
from model_api.models import Model


ovcore = Core()

# Create an adapter and pass it to create_model().
ovadapter = OpenvinoAdapter(ovcore, model_path)
model = Model.create_model(ovadapter, preload=True)</pre>

<p>Running inference then returns the result organized as a DetectionResult. 
You can therefore process the returned DetectionResult directly, without any model-specific post-processing.</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group=""># Run inference
detection_result = model(test_image)</pre>

<p>To handle the output, I wrote a function, overlay_detection_result, that overlays detection_result on the original image. It draws a bounding box and label on the original image for every detection whose score exceeds the threshold, and returns the annotated image together with the number of detected objects.</p>

<p>The main DetectionResult members referenced are <strong>.score</strong> for the confidence score, <strong>.xmin, .ymin, .xmax, .ymax</strong> for the bounding box, and <strong>.str_label</strong> for the label.</p>
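<p>As a concrete illustration of consuming those members, here is a minimal sketch using a mock detection record. The MockDetection class is purely illustrative; the real objects come from model_api and expose the same field names.</p>

```python
from dataclasses import dataclass


@dataclass
class MockDetection:
    # Illustrative stand-in exposing the same members as a real detection.
    score: float
    xmin: int
    ymin: int
    xmax: int
    ymax: int
    str_label: str


detections = [
    MockDetection(0.93, 10, 20, 110, 220, "person"),
    MockDetection(0.42, 50, 60, 90, 120, "dog"),
]

# Keep only detections whose score passes the threshold.
threshold = 0.8
kept = [d for d in detections if d.score >= threshold]
for d in kept:
    print(f"{d.str_label} {d.score:.2f}: ({d.xmin}, {d.ymin})-({d.xmax}, {d.ymax})")
# person 0.93: (10, 20)-(110, 220)
```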
<p>The bounding-box coordinates are also already scaled to the input image size, so they can be used as they are.</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from typing import Tuple

import cv2
import numpy as np

from model_api.models.utils import DetectionResult

# BGR colors used for drawing (defined elsewhere in the full source).
BBOX_COLOR = (0, 255, 0)
LABEL_COLOR = (0, 0, 255)


def overlay_detection_result(
    original_image: np.ndarray, detection_result: DetectionResult, threshold: float
) -> Tuple[np.ndarray, int]:
    num_detected = 0
    processed_image = original_image.copy()

    for det in detection_result[0]:
        score = det.score
        if score >= threshold:
            num_detected += 1

            # Bounding box
            x1 = int(det.xmin)
            y1 = int(det.ymin)
            x2 = int(det.xmax)
            y2 = int(det.ymax)

            # Clamp coordinates to the image bounds.
            x1 = max(0, min(x1, original_image.shape[1]))
            y1 = max(0, min(y1, original_image.shape[0]))
            x2 = max(0, min(x2, original_image.shape[1]))
            y2 = max(0, min(y2, original_image.shape[0]))

            # Draw the bounding box.
            cv2.rectangle(processed_image, (x1, y1), (x2, y2), BBOX_COLOR, 2)

            # Draw the label.
            display_text = (
                f"{det.str_label} {score:.2f}" if det.str_label else f"{score:.2f}"
            )
            cv2.putText(
                processed_image,
                display_text,
                (x1, y1 - 10),
                cv2.FONT_HERSHEY_SIMPLEX,
                0.5,
                LABEL_COLOR,
                2,
            )
    return processed_image, num_detected</pre>

<h2 class="wp-block-heading">Trying It on the Eight 
Models</h2>

<p>The eight models to be used are as follows:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">atss_model_file = os.path.join(MODEL_PATH, "atss_fp32/model.xml")
atss_onnx_model_file = os.path.join(MODEL_PATH, "atss_onnx_fp32/model.onnx")
yolos_model_file = os.path.join(MODEL_PATH, "yolos_fp32/model.xml")
yolos_onnx_model_file = os.path.join(MODEL_PATH, "yolos_onnx_fp32/model.onnx")
yolotiny_model_file = os.path.join(MODEL_PATH, "yolotiny_fp32/model.xml")
yolotiny_onnx_model_file = os.path.join(MODEL_PATH, "yolotiny_onnx_fp32/model.onnx")
dfinex_model_file = os.path.join(MODEL_PATH, "dfinex_fp32/model.xml")
dfinex_onnx_model_file = os.path.join(MODEL_PATH, "dfinex_onnx_fp32/model.onnx")</pre>

<p>These are put into a list named all_models and processed in one pass.</p>
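<p>The post does not show how all_models itself is assembled, but since the loop consumes (model file, display name) pairs, it can be built along these lines. The directory layout follows the listing above; the MODEL_PATH value and the display-name strings are illustrative assumptions.</p>

```python
import os

MODEL_PATH = "./models"  # assumed base directory for the model files

# Pair every model file with a short display name for logging and output files.
model_names = ["atss", "yolos", "yolotiny", "dfinex"]
all_models = []
for name in model_names:
    all_models.append((os.path.join(MODEL_PATH, f"{name}_fp32/model.xml"), name))
    all_models.append(
        (os.path.join(MODEL_PATH, f"{name}_onnx_fp32/model.onnx"), f"{name}_onnx")
    )

print(len(all_models))  # 8
```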
<p>That is, the differing input and output layers of all the models are handled by a single piece of code.</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">ovcore = Core()


# Test image
test_image = cv2.imread("./data/test_image.jpg")

threshold = 0.8
for model_path, model_name in all_models:
    ovadapter = OpenvinoAdapter(ovcore, model_path)
    model = Model.create_model(ovadapter, preload=True)
    detection_result = model(test_image)
    proc_image, num_det = overlay_detection_result(
        test_image, detection_result, threshold
    )

    print(f"{model_name} :: {num_det} objects detected (threshold: {threshold}).")
    cv2.imwrite(f"output_{model_name}.jpg", proc_image)</pre>

<h2 class="wp-block-heading">Full Source Code</h2>

<script src="https://gist.github.com/litcoder/971ce8d9f079017dcabb0f5421e55086.js"></script>