In the Sisense Linux distribution you can upload files to the local environment using Droppy, the file-management tool embedded in the Sisense platform.
PowerShell Single-File Upload Example
The script below lets you upload a file programmatically using a REST call. It is written in PowerShell.
Change the first four parameters:
dns - the URL you use for your Sisense site
RemoteLocation - the target folder within Droppy (File Management) where the file will be placed
FilePath - the full path of the source file on the local machine
AUTH_TOKEN - a Sisense API token; generate one and use it in place of <Token> (an example of generating a token programmatically appears after the script below)
$dns = 'https://myhost.sisense.com'
$RemoteLocation = 'data/Test'
$FilePath = 'C:\myfolder\myfile.csv';
$AUTH_TOKEN = 'Bearer <Token>'
###############################################################
# Build the upload URL and extract the file name from the full path
$pos = $FilePath.LastIndexOf("\")
$URL = $dns + '/app/explore/!/upload?vId=0&rename=0&to=/' + $RemoteLocation;
$filename = $FilePath.Substring($pos + 1)

# Read the file and build a multipart/form-data body manually
$fileBytes = [System.IO.File]::ReadAllBytes($FilePath);
$fileEnc = [System.Text.Encoding]::GetEncoding('UTF-8').GetString($fileBytes);
$boundary = [System.Guid]::NewGuid().ToString();
$LF = "`r`n";
$Headers = @{'Authorization' = $AUTH_TOKEN};
$bodyLines = (
    "--$boundary",
    "Content-Disposition: form-data; name=`"filename`"; filename=`"$filename`"",
    "Content-Type: application/octet-stream$LF",
    $fileEnc,
    "--$boundary--$LF"
) -join $LF

# Send the upload request to the Droppy endpoint
Invoke-RestMethod -Uri $URL -Method Post -ContentType "multipart/form-data; boundary=`"$boundary`"" -Headers $Headers -Body $bodyLines
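Both examples on this page authenticate with an API token sent as a Bearer value in the Authorization header. The sketch below shows one way such a token might be generated programmatically, written in Python (the same language as the folder-sync example further down). It is a minimal sketch, assuming the v1 authentication endpoint (/api/v1/authentication/login) is enabled on your instance and that its response contains an access_token field; this can vary between Sisense versions, and the host value and credentials shown are placeholders.

import requests

host = 'https://myhost.sisense.com'  # same URL as $dns / host in the examples
# Placeholder credentials - replace with a real Sisense user
credentials = {'username': 'user@example.com', 'password': '<password>'}

# Assumption: POST /api/v1/authentication/login returns JSON with an access_token field
response = requests.post(host + '/api/v1/authentication/login', data=credentials)
response.raise_for_status()
token = 'Bearer ' + response.json()['access_token']
print(token)  # use this value for $AUTH_TOKEN or the token setting in config.ini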
Python Example for Folder Syncing (Upload only)
In this example you set a local folder, including its sub-folders, to sync with a File Management virtual library.
The first run uploads all files and creates a lastRunTime.json file that records, for each file, when it was last uploaded.
On subsequent runs, only files whose modification date is newer than the timestamp recorded on the previous run are uploaded or updated. For example, with the sample config below, the local file C:\Users\Documents\Test\reports\sales.csv would be uploaded to data/Test/reports/sales.csv in File Management.
[DEFAULT]
# Sisense URL
host = https://test.sisense.com
# Path to store files on the target machine
remoteLocation = data/Test
# Path to the folder or file which should be transferred
# If a folder is specified, all subfolders will also be processed
path = C:\Users\Documents\Test\
# Sisense API token
token = Bearer TOKEN
# The script will store the last modified time for each processed file,
# so that only modified files are uploaded on subsequent executions
lastRunFilename = lastRunTime.json
import requests
import json
import os
import time
import configparser

# Read connection settings from config.ini
config = configparser.ConfigParser()
config.read('config.ini')

lastRunFilename = config.get('DEFAULT', 'lastRunFilename')
host = config.get('DEFAULT', 'host')
remoteLocation = config.get('DEFAULT', 'remoteLocation')
path = config.get('DEFAULT', 'path')
token = config.get('DEFAULT', 'token')

# Cut slashes at the end of paths if they exist
if path[-1] == '\\':
    path = path[:-1]
if remoteLocation[-1] == '/':
    remoteLocation = remoteLocation[:-1]

# Function to compare stored and file last modified time
def compareModificationTime(filePath):
    if (filePath in lastRunTime and os.path.getmtime(filePath) > int(lastRunTime.get(filePath))) or (filePath not in lastRunTime):
        return True
    else:
        return False

# Function to upload files to Droppy via the Sisense API
def uploadFile(filesNames, folderPath):
    if os.path.isfile(path):
        globalFolderPath = os.path.dirname(path)
    else:
        globalFolderPath = path
    # Map the local folder to its remote counterpart
    if folderPath != globalFolderPath:
        remotePath = remoteLocation + folderPath.replace(globalFolderPath, '')
    else:
        remotePath = remoteLocation
    remotePath = remotePath.replace('\\', '/')
    url = host + '/app/explore/!/upload?vId=0&rename=0&to=/' + remotePath
    files = []
    header = {'Authorization': token}
    notSentFiles = []
    for filename in filesNames:
        filePath = '%s\\%s' % (folderPath, filename)
        if compareModificationTime(filePath):
            files.append(("file", (open(filePath, "rb"))))
            lastRunTime[filePath] = int(time.time())
        else:
            notSentFiles.append(filePath)
    if files:
        r = requests.post(url, files=files, headers=header)
        for filename in filesNames:
            print(remotePath + '/' + filename, r.status_code, r.reason)
    return notSentFiles

# Load the timestamps stored by the previous run, if any
try:
    f = open(lastRunFilename, "r")
    timestamp = f.read()
    lastRunTime = json.loads(timestamp)
except Exception:
    lastRunTime = {}

notSentFiles = []
if os.path.isdir(path):
    for root, subdirs, files in os.walk(path):
        if files:
            notSentFiles += uploadFile(files, root)
            time.sleep(0.1)
elif os.path.isfile(path):
    notSentFiles += uploadFile([os.path.basename(path)], path.rsplit('\\', 1)[0])
elif not os.path.exists(path):
    print("Specified path doesn't exist")

if notSentFiles:
    print('These files were not uploaded because they were not modified:')
    for file in notSentFiles:
        print(file)

# Persist the updated timestamps for the next run
with open(lastRunFilename, 'w') as f:
    json.dump(lastRunTime, f)
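For reference, the lastRunTime.json file the script writes is a flat JSON object mapping each uploaded file path to the Unix timestamp of the run that last uploaded it. The paths and timestamps below are hypothetical and only illustrate the shape of the file:

{
    "C:\\Users\\Documents\\Test\\reports\\sales.csv": 1650000000,
    "C:\\Users\\Documents\\Test\\customers.csv": 1650000000
}

Save the script next to config.ini (the script file name is up to you, e.g. sync.py) and run it with python sync.py. On each subsequent run, only files whose modification time is newer than the stored timestamp are uploaded again.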