Uploading Local Files to a Remote Server
Article info
Created: December 25, 2024
I had GPT write this script so that uploading the static files generated by Docusaurus would be easy from now on.
```python
import os
import shutil
import sys

import paramiko
from scp import SCPClient, SCPException

# Remote server details
hostname = 'server IP or domain'
port = 22
username = 'username'
password = 'password'
# Local folder to upload
local_folder = r'D:\工具\node-v22.12.0-win-x64\ifdess.cn\build'
# Remote destination folder
remote_folder = '/opt/1panel/apps/openresty/openresty/www/sites/ifdess.cn/index/build'
# Path of the local archive produced by compression
local_archive = r"D:\工具\node-v22.12.0-win-x64\ifdess.cn\build.tar.gz"

# Global upload statistics
total_files = 0
successful_uploads = 0
failed_uploads = 0


def create_ssh_client(hostname, port, username, password):
    """Open an SSH connection."""
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname, port=port, username=username, password=password)
        print(f"Connected to {hostname}")
        return client
    except Exception as e:
        print(f"SSH connection failed: {e}")
        sys.exit(1)


def compress_folder(local_folder, archive_name):
    """Compress the folder into a .tar.gz archive."""
    try:
        # make_archive appends the ".tar.gz" suffix itself, so strip it
        # from the target name first to avoid a doubled extension
        if archive_name.endswith('.tar.gz'):
            shutil.make_archive(archive_name[:-7], 'gztar', local_folder)
        else:
            shutil.make_archive(archive_name, 'gztar', local_folder)
        print(f"Compressed {local_folder} into {archive_name}")
    except Exception as e:
        print(f"Failed to compress folder: {e}")
        sys.exit(1)


def upload_file(scp, local_file, remote_file):
    """Upload a single file and record the result."""
    global total_files, successful_uploads, failed_uploads
    total_files += 1
    try:
        print(f"[uploading] {local_file} -> {remote_file}")
        scp.put(local_file, remote_file)
        print(f"[uploaded] {local_file} -> {remote_file}")
        successful_uploads += 1
    except SCPException as e:
        print(f"[failed] {local_file} -> {remote_file}: {e}")
        failed_uploads += 1


def upload_and_extract(local_archive, remote_folder):
    """Upload the archive and extract it on the server."""
    ssh_client = create_ssh_client(hostname, port, username, password)
    scp = SCPClient(ssh_client.get_transport())
    try:
        # Upload the archive
        remote_archive = os.path.join(remote_folder, os.path.basename(local_archive)).replace("\\", "/")
        upload_file(scp, local_archive, remote_archive)
        # Extract it into the destination folder
        extract_command = f"tar -xzvf {remote_archive} -C {remote_folder}"
        stdin, stdout, stderr = ssh_client.exec_command(extract_command)
        stdout.channel.recv_exit_status()  # wait for the command to finish
        print(f"Extracted {remote_archive} into {remote_folder} on the server")
        # Remove the archive
        delete_command = f"rm -f {remote_archive}"
        stdin, stdout, stderr = ssh_client.exec_command(delete_command)
        stdout.channel.recv_exit_status()
        print(f"Deleted remote archive {remote_archive}")
    except Exception as e:
        print(f"Error during upload or extraction: {e}")
    finally:
        scp.close()
        ssh_client.close()
        print(f"Disconnected from {hostname}")


# Compress, then upload and extract
compress_folder(local_folder, local_archive)
upload_and_extract(local_archive, remote_folder)

# Print upload statistics
print("\nUpload complete")
print(f"Total files: {total_files}")
print(f"Succeeded: {successful_uploads}")
print(f"Failed: {failed_uploads}")
```
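One subtlety worth noting in `compress_folder`: `shutil.make_archive` takes a *base* name and appends the format's extension itself, which is why the script strips a trailing `.tar.gz` before calling it. A quick local check (the folder and file names here are throwaway examples, not the paths from the script):

```python
import os
import shutil
import tempfile

# Create a throwaway source folder with one file in it
src = tempfile.mkdtemp()
with open(os.path.join(src, "index.html"), "w") as f:
    f.write("<h1>hello</h1>")

# make_archive takes a BASE name without the extension and
# returns the full path of the archive it created
base = os.path.join(tempfile.mkdtemp(), "build")
archive = shutil.make_archive(base, "gztar", src)

print(archive)  # ends with "build.tar.gz", not "build"
```

Passing `build.tar.gz` as the base would have produced `build.tar.gz.tar.gz`, hence the suffix guard in the script.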
GPT really is excellent for writing scripts like this: it needed almost no debugging, and the logic came out clear. One thing to keep in mind, though: when uploading files to a server, it is best to send a single compressed archive and then extract it on the server. Uploading lots of scattered files directly is slow, and once many transfers are in flight it is easy to hit congestion and failed uploads. In actual use, the script above finishes in under 3 seconds.
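The "one archive, one transfer" point is easy to see locally with the standard `tarfile` module: no matter how many small files the build folder contains, SCP only ever has to move a single `.tar.gz`. A minimal sketch (the folders below are temporary stand-ins, not the real build output):

```python
import os
import tarfile
import tempfile

# Simulate a build folder containing many small files
build = tempfile.mkdtemp()
for i in range(50):
    with open(os.path.join(build, f"page{i}.html"), "w") as f:
        f.write("<html></html>")

# Bundle everything into one archive: a single SCP transfer
# instead of 50 round-trips for 50 tiny files
archive = os.path.join(tempfile.mkdtemp(), "build.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    tar.add(build, arcname="build")

# Verify the archive holds the whole folder
with tarfile.open(archive, "r:gz") as tar:
    members = tar.getnames()
print(len(members))  # 51: the "build" directory entry plus 50 files
```

Each per-file SCP transfer pays protocol round-trip overhead, which is why one archive of the same bytes moves so much faster.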