service: this repository shows one example of how to deploy a Python service via a WSGI web server

service-master.zip
  • service-master/
    • Dockerfile (135B)
    • README.md (3.7KB)
    • InfoSvc.py (1KB)
    • .github/workflows/service_workflow.yml (525B)
    • Util.py (154B)
    • requirements.txt (62B)
    • images/
      • client_output.png (64.4KB)
      • dev_service_ouput.png (68.6KB)
      • gunicorn_output.png (52.8KB)
    • start.sh (37B)
    • client.py (587B)
    • wsgi.py (65B)
Description
# A sample service - how do you deploy a simple Python-based service/API in production?

In [https://github.com/sughosneo/async-rest](https://github.com/sughosneo/async-rest) we saw how to build highly performance-oriented, async-based REST services, and we compared them in terms of performance. In this article we focus only on the simple services we actually write most of the time, and on how their performance varies purely based on choosing the right deployment strategy. If you plan to write a REST-based service, you will probably pick the Flask or Falcon framework eight times out of ten, unless you have specific prior requirements in mind. But the question is: how do you actually serve or deploy those in production? Should you use their built-in app server, or ```from wsgiref import simple_server```? We all know that any web server which supports the WSGI standard can deploy the service/API, but would that be good enough to handle multiple requests in production? The answer is "NO". The Flask/Falcon documentation recommends a proper WSGI server such as ***```gunicorn or uWSGI```*** for production, with Nginx as the preferred proxy server in front. If you are wondering what difference this could possibly make, you can run the steps below and check it out on your own. Follow these steps to run this ***Info*** service within Docker.

***```With gunicorn```***
-------------------------

```bash
git clone https://github.com/sughosneo/service.git
cd service
docker build . -t service:1.0
docker run -p 8000:8000 service:1.0 &
```

Once you run the commands above, you should see the following output:

![gunicorn server output](./images/gunicorn_output.png)

The sample client in this project is based on the asyncio module and only works with Python 3.7 onwards. If your local environment is not configured to run ```python3 client.py``` directly, it is easier to step into the running container and run the client from there (a minimal sketch of such a client appears at the end of this article). For that you can use the following commands:

```bash
docker exec -it 0e315e064cf5 /bin/bash
python3 client.py
```

Once you run the client you should see output like this:

![client output](./images/client_output.png)

***```With wsgiref http simple server```***
--------------------------------------------

We do not create a separate Dockerfile for this case. Instead, we get into the same container and run InfoSvc.py directly:

```bash
docker exec -it 0e315e064cf5 /bin/bash
python3 InfoSvc.py
```

You then need to change the configuration of ***client.py***, because by default the simple HTTP server runs on ***port 5000***.

![service side output](./images/dev_service_ouput.png)

Look carefully at the time this default web server takes to process those 100 requests: it is much slower than the previous run. (A minimal sketch contrasting the two serving approaches follows below.)
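For reference, here is a minimal sketch of the kind of WSGI callable being served in both scenarios, together with the development-style `wsgiref` serving loop discussed above. The module name, callable name, and port are illustrative assumptions, not necessarily what `InfoSvc.py` or `wsgi.py` in this repository actually contain:

```python
# wsgi_sketch.py - illustrative only; not the repo's actual InfoSvc.py/wsgi.py
from wsgiref.simple_server import make_server


def app(environ, start_response):
    """A minimal WSGI callable; Flask/Falcon apps expose this same interface."""
    body = b'{"status": "ok"}'
    start_response(
        "200 OK",
        [("Content-Type", "application/json"), ("Content-Length", str(len(body)))],
    )
    return [body]


if __name__ == "__main__":
    # Development-style serving: one process, one request at a time.
    with make_server("0.0.0.0", 5000, app) as server:
        server.serve_forever()

# In production you would point gunicorn at the same callable instead, e.g.:
#   gunicorn --bind 0.0.0.0:8000 --workers 4 wsgi_sketch:app
```

The point of the contrast: `wsgiref.simple_server` handles one request at a time in a single process, while gunicorn pre-forks several workers over the same `app` callable, which is why the timing gap shows up once the client fires 100 requests.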
***```Conclusion```***
-------------------

In both scenarios the server processed requests synchronously. In the gunicorn approach, however, the server starts several workers to handle incoming requests, while in the second scenario a single thread was trying to process every request on its own. gunicorn also supports async worker types, but its default workers are synchronous. So don't worry if you are running something synchronous; it is not always a bad thing. If your design and requirements allow it, you can always build a simple service of this sort and deploy it with the correct strategy in place.
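As an aside, here is a minimal, standard-library sketch of an asyncio-based timing client along the lines of the `client.py` described above (Python 3.7+). It is illustrative only and may differ from the repository's actual client:

```python
# client_sketch.py - illustrative only; the repo's client.py may differ.
# Fires a batch of concurrent GET requests and reports the total time taken.
import asyncio
import time

HOST, PORT, N_REQUESTS = "127.0.0.1", 8000, 100


async def fetch(path: str = "/") -> str:
    # Open a raw TCP connection and speak plain HTTP/1.1 to the service.
    reader, writer = await asyncio.open_connection(HOST, PORT)
    request = f"GET {path} HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
    writer.write(request.encode())
    await writer.drain()
    response = await reader.read()  # read until the server closes the connection
    writer.close()
    await writer.wait_closed()
    return response.decode(errors="replace")


async def main() -> None:
    started = time.perf_counter()
    await asyncio.gather(*(fetch() for _ in range(N_REQUESTS)))
    print(f"{N_REQUESTS} requests took {time.perf_counter() - started:.2f}s")


if __name__ == "__main__":
    asyncio.run(main())
```

Run it against port 8000 for the gunicorn container, or change `PORT` to 5000 when testing the `wsgiref` run, mirroring the reconfiguration step mentioned earlier.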