Yao_Feng_Joint_3D_Face_ECCV_2018_paper.zip

  • lycheeCong
    About the author
  • C/C++
    Development tool
  • 1.1MB
    File size
  • zip
    File format
  • 0
    Times favorited
  • 1 point
    Download cost (points)
  • 0
    Downloads
  • 2021-04-28 22:45
    Upload date
The 2018 3DMM-related paper; helpful for work on face recognition and face model reconstruction.
Yao_Feng_Joint_3D_Face_ECCV_2018_paper.zip
  • Yao_Feng_Joint_3D_Face_ECCV_2018_paper.pdf
    1.2MB
Content introduction
Joint 3D Face Reconstruction and Dense Alignment with Position Map Regression Network

Yao Feng¹ [0000-0002-9481-9783], Fan Wu² [0000-0003-1970-3470], Xiaohu Shao³,⁴ [0000-0003-1141-6020], Yanfeng Wang¹ [0000-0002-3196-2347], and Xi Zhou¹,² [0000-0003-2917-0436]

1 Cooperative Medianet Innovation Center, Shanghai Jiao Tong University
2 CloudWalk Technology
3 CIGIT, Chinese Academy of Sciences
4 University of Chinese Academy of Sciences

Abstract. We propose a straightforward method that simultaneously reconstructs the 3D facial structure and provides dense alignment. To achieve this, we design a 2D representation called UV position map which records the 3D shape of a complete face in UV space, then train a simple Convolutional Neural Network to regress it from a single 2D image. We also integrate a weight mask into the loss function during training to improve the performance of the network. Our method does not rely on any prior face model, and can reconstruct full facial geometry along with semantic meaning. Meanwhile, our network is very lightweight and spends only 9.8 ms to process an image, which is much faster than previous works. Experiments on multiple challenging datasets show that our method surpasses other state-of-the-art methods on both reconstruction and alignment tasks by a large margin. Code is available at https://github.com/YadiraF/PRNet.

Keywords: 3D Face Reconstruction · Dense Face Alignment

1 Introduction

3D face reconstruction and face alignment are two fundamental and highly related topics in computer vision. In the last decades, research in these two fields has benefited each other. In the beginning, face alignment, which aims at detecting special 2D fiducial points [66, 64, 38, 46], was commonly used as a prerequisite for other facial tasks such as face recognition [59], and it assists 3D face reconstruction [68, 27] to a great extent. However, researchers find that 2D alignment has difficulties [65, 30] in dealing with problems of large poses or occlusions. With the development of deep learning, many computer vision problems have been well solved by utilizing Convolutional Neural Networks (CNNs). Thus, some works start to use CNNs to estimate the 3D Morphable Model (3DMM) coefficients [32, 67, 47, 39, 48, 40] or 3D model warping functions [4, 53] to restore the corresponding 3D information from a single 2D facial image, which provides both dense face…
Comments
    Related recommendations
    • CVPR2018-papers
      CVPR 2018 papers: SSNet, a scale selection network for online 3D action prediction; very large-scale global SfM by distributed motion averaging; dynamic feature learning for partial face recognition; human appearance transfer; globally optimal inlier set maximization for Atlanta frame estimation; re-weighted adversarial adaptation network for unsupervised domain adaptation...
    • CVPR 2018 paper collection, part 1
      SLAM, DNN, CNN, object detection and recognition, video object segmentation, image segmentation, natural language processing, autonomous driving
    • feedbackprop: [CVPR 2018] Feedback-prop
      CVPR 2018. For more details, please refer to … or send an email. Abstract: We propose an inference procedure for deep convolutional neural networks (CNNs) when partial evidence is available. Our method consists of a general feedback-based propagation approach (feedback-prop) that, when a set of...
    • SIN: CVPR 2018
      In CVPR 2018. ( ) Requirements: software — TensorFlow 1.3.0 (see: ). Python packages you may not have: cython, python-opencv, easydict. Installation (sufficient for the demo): clone the SIN repository. # Make sure to clone with --...
    • cvpr2018-hnd: CVPR 2018, hierarchical novelty detection for visual object recognition
      booktitle={CVPR}, year={2018} } Dependencies: (or ) (Since PyTorch frequently updates its library, our code will not work if you use a different version.) (load features in mat format for the AWA and CUB experiments) (save features and random numbers) (plot...
    • CVPR2021-Paper-Code-Interpretation: cvpr2021/cvpr2020/cvpr2019/cvpr2018/cvpr2017 papers, compiled by the 极市 (CVMart) team
      cvpr2021 / cvpr2020 / cvpr2019 / cvpr2018 / cvpr2017 (papers / code / projects / paper readings). Paper interpretation digest: ://bbs.cvmart.net/articles/3031. Paper classification summary: . Collection of CVPR best papers from 2000-2020, with explanations, etc.: ://bbs.cvmart...
    • TextureGAN: CVPR 2018 TextureGAN implementation in PyTorch
      This is an implementation of our CVPR 2018 paper TextureGAN. Training: run train.py. Training options: --dataroot ./datasets/contour2shirt --checkpoints_dir ../checkpoints_pub/textureGAN --loadSize 256 --fineSize 256 --nz 8 --...
    • cvpr2018-notes: Notes from the CVPR 2018 conference, with links to papers, thoughts, and research directions
      CVPR 2018 notes. Below are links, notes, and ideas from the most interesting papers, challenges, and workshops I came across during the conference. However, you may prefer to form your own view of computer vision and pattern recognition by reviewing all 979 accepted papers and working through the content of all 21 tutorials and 48 workshops...
    • dsrn: Implementation of the CVPR 2018 paper
      Image super-resolution via dual-state recurrent networks (CVPR 2018). Citation: @inproceedings{han2018image, title={Image super-resolution via dual-state recurrent networks}, author={Han, Wei and Chang, Shiyu and Liu, Ding...
    • CVPR 2018 paper collection, part 3
      CVPR paper collection: SLAM, DNN, CNN, object detection and recognition, video object segmentation, image segmentation, natural language processing, autonomous driving