Deploy A Python Script in Vercel & Supabase

200.0 GBP peopleperhour Tech & Programming Overseas
709 days ago

Details

We have an existing Python script that we would like to deploy via Vercel / Supabase.
We are looking for an engineer experienced with Vercel and Python to handle the deployment.
We will grant access to our Vercel account and provide the specific Python script.
We would expect this to take less than one day for an experienced developer.
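
The brief does not say what the script does, but as a rough illustration of how a Python script is typically exposed on Vercel and wired to Supabase, here is a minimal sketch. The lowercase handler class follows Vercel's documented Python runtime pattern; the "jobs" table name and the environment variable names are placeholder assumptions, not details from this listing.

```python
# api/index.py -- minimal sketch of a Vercel Python serverless function
# that reads from Supabase. Table name and env var names are placeholders.
import json
import os
from http.server import BaseHTTPRequestHandler

from supabase import create_client  # pip install supabase


class handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Env vars are configured in the Vercel project settings.
        url = os.environ["SUPABASE_URL"]
        key = os.environ["SUPABASE_SERVICE_ROLE_KEY"]
        client = create_client(url, key)

        # Example query: fetch up to 10 rows from a hypothetical "jobs" table.
        rows = client.table("jobs").select("*").limit(10).execute().data

        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(rows).encode("utf-8"))
```

Dependencies such as the supabase client would go in requirements.txt; Vercel deploys each file under api/ as a serverless function, so the existing script's logic would be wrapped in (or called from) a handler like this.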



Similar Listings

Experience Level: Expert

I'm looking for a developer (or developers) to continue the work of other developers as our needs increase. I've advertised before, but do put an offer in even if I didn't accept last time. The software is Python Scrapy based. Below are some of the links for the websites I need scraping. I only need the information as attached in the spreadsheet, and it has to be in the format shown, as all the sheets are linked together - https://merge-csv.com/ I have IDOX and ASPX software developed, so a continuation of what already works would potentially be the most efficient approach. Let me know possible timescales.

Most of the following run on ASPX:
https://planweb01.rother.gov.uk/OcellaWeb/planningSearch
https://planning.agileapplications.co.uk/tmbc/search-applications
https://plantech.centralbedfordshire.gov.uk/PLANTECH/DCWebPages/acolnetcgi.gov?ACTION=UNWRAP&RIPNAME=Root.pgesearch
https://data.whitehorsedc.gov.uk/java/support/Main.jsp?MODULE=ApplicationCriteria&TYPE=Application
https://planningregister.cherwell.gov.uk/Search/Advanced
https://eppingforestdc.my.site.com/pr/s/planning-application/a0hTv000004boMwIAI/epf260024?c__r=Arcus_BE_Public_Register&tabset-dc51c=2 (this one was running, but there is now a human test on the site)

The following one runs on IDOX, which I can normally get going myself, but it won't run on the code I have:
https://pa.sevenoaks.gov.uk/online-applications/search.do?action=simple

And these are links to the sites that need EasyOCR parsing work to extract data from online forms (the scraper itself already works):
https://developmentandhousing.hackney.gov.uk/planning/index.html?fa=search
https://eppingforestdc.my.site.com/pr/s/planning-application/a0hTv000004boMwIAI/epf260024?c__r=Arcus_BE_Public_Register&tabset-dc51c=2

Thanks for looking. Any developers interested, I will send a link to the developed software folders. Let me know if this is something you are able to do, with an idea of costs. (A minimal Scrapy sketch follows this listing.)
25.0 GBP Tech & Programming peopleperhour Overseas
1 day ago
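
For context on the Scrapy listing above, the sketch below shows a minimal, hypothetical spider against the IDOX-style simple-search page. The form field name and the CSS selectors are assumptions that would need verifying against the live site and the existing project folders; this is not the client's actual code.

```python
# planning_spider.py -- hypothetical sketch only; the real project already has
# IDOX and ASPX spiders. Form field names and CSS selectors below are guesses.
import scrapy


class IdoxSimpleSearchSpider(scrapy.Spider):
    name = "idox_simple_search"
    start_urls = [
        "https://pa.sevenoaks.gov.uk/online-applications/search.do?action=simple"
    ]

    def parse(self, response):
        # Submit the simple-search form with a placeholder keyword.
        yield scrapy.FormRequest.from_response(
            response,
            formdata={"searchCriteria.simpleSearchString": "extension"},  # assumed field name
            callback=self.parse_results,
        )

    def parse_results(self, response):
        # Assumed result markup: one <li class="searchresult"> per application.
        for result in response.css("li.searchresult"):
            yield {
                "title": result.css("a::text").get(default="").strip(),
                "url": response.urljoin(result.css("a::attr(href)").get(default="")),
                "meta": " ".join(result.css("p.metaInfo::text").getall()).strip(),
            }
```
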
TowDispatch is a full-featured platform designed to streamline operations for car towing agencies. The system supports both web and mobile environments and enables seamless coordination between dispatch administrators and field drivers.

Tech Stack
Web Frontend: React.js + Tailwind CSS
Backend & Realtime DB: Supabase (PostgreSQL, Auth, Edge Functions)
Mobile Application: Capacitor (for native app packaging), integrated via Android Studio for Android, with iOS deployment planned

Core Functionality

Admin Dashboard
Job Management: Create, assign, update, and delete tow jobs
Driver Management: Add, edit, and monitor driver status and activities
Reporting Module: Generate operational and performance reports
Notification System: Real-time job status feedback via toast messages (web) and push notifications (mobile)

Driver App
Job Inbox: View assigned jobs with real-time updates
Job Workflow: Update job progress (e.g., en route, job complete)
Profile: Manage driver-specific settings and preferences

Current Challenge: Push Notification Implementation
With the mobile app version now fully operational through Capacitor and Android Studio, the primary challenge lies in implementing a reliable and scalable push notification system to replicate and extend the existing toast-based web notification model.

Existing Web Notification Flow
Notifications are triggered via toast messages displayed in-app for real-time user feedback.
All notification events (e.g., job assignment, completion) are stored in Supabase for traceability.
(A rough push-notification fan-out sketch in Python follows this listing.)
237.0 GBP Tech & Programming peopleperhour Overseas
9 hours ago
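
For the TowDispatch listing above, one way to picture the push side is a fan-out step that reads the notification events already stored in Supabase and forwards them to Firebase Cloud Messaging. The sketch below is a hypothetical Python worker using FCM's HTTP v1 endpoint; in practice this would more likely live in a Supabase Edge Function, and the table schema, environment variables, and the pre-obtained FCM access token (normally minted from a service account) are all assumptions, not details from the listing.

```python
# push_worker.py -- hypothetical fan-out sketch: read pending notification
# events from Supabase and forward them to FCM. Table/column names, env vars,
# and the bearer token are placeholders, not taken from the real system.
import os

import requests  # pip install requests
from supabase import create_client  # pip install supabase

FCM_V1_URL = "https://fcm.googleapis.com/v1/projects/{project_id}/messages:send"


def send_pending_notifications() -> None:
    supabase = create_client(
        os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_ROLE_KEY"]
    )

    # Fetch notification events that have not been pushed yet (assumed schema).
    pending = (
        supabase.table("notifications")
        .select("id, device_token, title, body")
        .eq("pushed", False)
        .execute()
        .data
    )

    url = FCM_V1_URL.format(project_id=os.environ["FCM_PROJECT_ID"])
    headers = {"Authorization": f"Bearer {os.environ['FCM_ACCESS_TOKEN']}"}

    for event in pending:
        payload = {
            "message": {
                "token": event["device_token"],
                "notification": {"title": event["title"], "body": event["body"]},
            }
        }
        requests.post(url, json=payload, headers=headers, timeout=10).raise_for_status()

        # Mark the event as delivered so it is not pushed twice.
        supabase.table("notifications").update({"pushed": True}).eq(
            "id", event["id"]
        ).execute()


if __name__ == "__main__":
    send_pending_notifications()
```
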
I have two files. The driver file has ~2.2MM records of consumers in the US; it contains information from estimates they received from a home improvement contractor. Those data points include things like zip code, state, type of services, whether sold or not, whether financed or not, salesperson's name, office, etc. The second file is the results file, which contains ~1.6MM records; it is the portion of the driver file where a credit file hit was matched. The results file has credit score, DTI, income, property value, CLTV, equity position, and a bunch of other credit data. This information is anonymized, so no PII.

I am looking for someone to take this data and drive valuable insights by assessing both files individually: conversion rate, by salesperson, average sale price, etc. on the driver file at a zip, agent, and office level. Then perform a similar exercise on the results file to see what credit demographics exist at a zip level, and also at the agent and office level. Then merge the data at each level to create success metrics and insights from the combination of sales and credit information. It can be as simple as a glorified pivot table, up to a more in-depth analysis including some weighted scores and data enriched from other sources like census data. (A rough pandas sketch follows this listing.)

I can work in an iterative manner and will spend time with the consultant to refine results daily with new marching orders until it is complete. I may have other ongoing work for the right person. I can provide detailed layouts, results, and the cuts that need to be performed. I would prefer someone with some knowledge of credit demographic data; otherwise they may not be able to contribute, or the learning curve may be more dramatic.
250.0 USD Tech & Programming peopleperhour Overseas
2 days ago
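
For the data analysis listing above, the described roll-ups (conversion rate and average sale price by zip / agent / office, a parallel credit roll-up, then a merge) could start from something like the pandas sketch below. Every file and column name here is a placeholder assumption, since the actual layouts are only shared with the hired consultant.

```python
# insights_sketch.py -- hypothetical pandas roll-up for the driver/results files.
# File names and columns (zip, agent, office, sold, sale_price, credit_score, dti)
# are placeholders; the real layouts would come from the client's specs.
import pandas as pd

driver = pd.read_csv("driver_file.csv")    # ~2.2MM estimate records
results = pd.read_csv("results_file.csv")  # ~1.6MM credit-matched records

# Conversion rate and average sale price at the zip / agent / office level.
sales_rollup = (
    driver.groupby(["zip", "agent", "office"])
    .agg(
        estimates=("sold", "size"),
        conversion_rate=("sold", "mean"),      # assumes sold is coded 0/1
        avg_sale_price=("sale_price", "mean"),
    )
    .reset_index()
)

# Credit demographics at the same levels from the results file.
credit_rollup = (
    results.groupby(["zip", "agent", "office"])
    .agg(
        matched_records=("credit_score", "size"),
        avg_credit_score=("credit_score", "mean"),
        avg_dti=("dti", "mean"),
    )
    .reset_index()
)

# Merge the two roll-ups to combine sales performance with credit profile.
combined = sales_rollup.merge(credit_rollup, on=["zip", "agent", "office"], how="left")
combined.to_csv("combined_insights.csv", index=False)
```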