SSH into LXC and install applications via the Java API
location: linuxexchange.com - date: June 16, 2015
I am trying to use libvirt's Java API to SSH into a domain and install some applications. Using virsh, it's pretty straightforward from the command line:
virsh -c lxc:/// create myguest.xml
virsh -c lxc:/// console myguest
virsh -c lxc:/// lxc-enter-namespace myguest -- sudo yum install <application>
But in order to implement the above functionality through the Java API, I am not really sure what to do. The following call gets me a connection over SSH, but I am not sure how to send remote commands:
Connect conn = new Connect("lxc+ssh:///");
Please can someone help me out with this?
The only other solution I could find is using the Java SSH library JSch (from JCraft) to SSH into the container.
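If the guest runs an SSH daemon, the one-shot form of the command is worth noting, since it is exactly what a Java SSH library such as JSch wraps with its "exec" channel. A minimal sketch; the guest address and package name are placeholders:

```shell
# One-shot remote command over ssh -- the programmatic equivalent is
# JSch's "exec" channel. Guest address and package are placeholders:
#   ssh root@myguest 'yum -y install httpd'
# ssh hands the command string to the remote shell and returns its
# exit status; the local analogue of that hand-off:
sh -c 'echo remote-exec-ok'
```

In JSch the same command string would go to ChannelExec.setCommand() before connecting the channel.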
Cannot ssh into my server via internet. What am I missing?
location: ubuntuforums.com - date: April 20, 2012
So I've been experimenting with ssh-server on an old computer I've got. Following the Ubuntu SSH configuration wiki, I can SSH into it just fine via my local WLAN, but when I try to use my public IP address I just get a timeout error, and from what I can see in the SSH log file the request is not reaching my server at all. I have ssh-server configured on a non-standard port and I'm specifying exactly that port when trying to log in.
I'm pretty sure the source of the problem is the port forwarding on my router, but I'm almost certain I've set it up correctly. I've attached some pictures to show what I'm doing to setup the port forwarding. Can someone tell me if there is something fundamental that I am missing?
I realize that trying to SSH into a server via the internet from the same LAN can sometimes upset the router, but I've been trying to SSH in via the wireless hotspot on my phone. I can SSH into my university account, but when I try to ssh into
rsync via ssh has overwritten some files and I'm looking to restore them
- date: August 27, 2010
Hello. I have two disks I have been working with: one in my MacBook Pro (HFS+) and one connected to a Linux machine (FAT; it's an external hard drive). Recently I started exploring backing up my files with rsync over SSH from my Mac to the external hard drive connected to the Linux machine. I did this once successfully and copied all of my Mac files to the external. However, I did something incorrectly the second time; I believe I ran rsync in the wrong direction, and it has overwritten some files on my Mac (configuration files for iCal that had events stored, an iPhone program I've been working on). The full rsync command I used was the following:
rsync -avz -e ssh [email protected]
However, I was under the impression that rsync wouldn't touch files with a newer timestamp, so even if it was copying from the Linux external drive to the Mac hard drive, why would it overwrite newer files?
I have recently downloaded scalpel
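For context on the overwrite question above: by default rsync replaces any destination file whose content differs, regardless of timestamps; only the -u/--update flag makes it skip files that are newer on the receiving side. A minimal local sketch with throwaway paths:

```shell
# Demonstrate --update: rsync skips destination files that are newer
# than the source copy. Throwaway example directories:
mkdir -p /tmp/rs_src /tmp/rs_dst
echo old > /tmp/rs_src/f.txt
echo new > /tmp/rs_dst/f.txt
touch -d '2030-01-01' /tmp/rs_dst/f.txt   # make the destination newer
rsync -avu /tmp/rs_src/ /tmp/rs_dst/
cat /tmp/rs_dst/f.txt   # still "new": -u protected the newer file
```

Over SSH the same flags apply (with -e ssh), and --dry-run shows what would change without touching anything.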
Transferring Files via Reverse SSH
location: linuxquestions.com - date: January 21, 2011
I currently have clients reverse-SSH into my company's SSH server, and then I simply connect to them through a tunnel to that server.
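Once such a reverse tunnel is up, file transfers can ride the same forwarded port. A sketch with placeholder host names and ports:

```shell
# On the client behind NAT: publish its sshd on the company server's
# port 2222 (names and ports are placeholders):
#   ssh -N -R 2222:localhost:22 user@ssh.company.example
# On the company server: copy files to the client through that port:
#   scp -P 2222 report.txt user@localhost:/home/user/
# Printed rather than executed, since both ends are placeholders:
printf '%s\n' \
  'ssh -N -R 2222:localhost:22 user@ssh.company.example' \
  'scp -P 2222 report.txt user@localhost:/home/user/'
```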
Install git via SSH
- date: April 18, 2010
I'm running XBMCLive, which is running on top of Ubuntu. When attempting to install git through the command line, apt-get doesn't seem to have the correct source added/enabled:
:~$ sudo apt-get install git-core
[sudo] password for xbmc:
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Couldn't find package git-core
What source do I need to add, and how can I do this from the command line? Thanks!
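A likely first step, sketched below with the privileged commands left commented since they need root and network access: refresh the package index, and note that on newer Ubuntu releases the package is named plain git rather than git-core.

```shell
# "Couldn't find package" is often just a stale or empty package index:
#   sudo apt-get update
#   sudo apt-get install git-core    # on newer releases: ... install git
# apt-cache can confirm which name the release actually ships:
#   apt-cache search --names-only '^git'
# The two-step recipe, printed for reference:
echo 'apt-get update && apt-get install git-core'
```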
Program to *move* files via SSH over network?
location: ubuntuforums.com - date: October 24, 2009
This is not an important issue, but I'm wondering if there are any applications that I can use to *move* a file from one system to another, over the network, using SSH. I use WinSCP on my Windows system at work to do this with the F6 ('move') command. What this does (I'm guessing) is copy the file, then remove the local copy.
On Ubuntu, I use gftp and sftp (command line) regularly to transfer files via SSH, but it bothers me (slightly) to have to remember to delete the local copies when I'm finished transferring. (I hate ending up with duplicate files.)
I don't believe I can use the nautilus SSH connection option because I have public key authentication set up, and there doesn't seem to be an option to point at my identity file in the UI.
I've also used scp a few times in the past, but don't believe this can move rather than copy.
Transferring remote files via ssh,find,cpio locally
location: linuxquestions.com - date: November 20, 2011
I am trying to transfer my website using SSH (host-based authentication) using:
Graphically edit remote files via SSH
location: ubuntuforums.com - date: August 12, 2008
On Dapper, (Edgy, Feisty, Gutsy too AFAIK) it was possible to use the Connect to Server applet to edit remote files over SSH using a text editor like gedit.
All you had to do, once the remote folder was mounted, was click on the remote file, supply the password again, and then edit and save directly. However, this invaluable functionality seems to be lost in Hardy. On the last step, the password prompt simply comes back up again.
A couple of bug reports are quietly gathering dust on launchpad but there seem to be no solutions. Any ideas, workarounds, alternatives?
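One workaround that bypasses the GNOME mount/password loop entirely is sshfs: mount the remote directory over FUSE, then edit with any local editor. A sketch; host and paths are placeholders, so the commands are shown rather than executed:

```shell
# Mount a remote directory locally over SSH, edit, then unmount.
#   mkdir -p ~/remote-www
#   sshfs user@server:/var/www ~/remote-www
#   gedit ~/remote-www/index.html
#   fusermount -u ~/remote-www    # unmount when finished
echo 'sshfs user@server:/var/www ~/remote-www'
```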
Can't SSH into Ubuntu 12.04 Install
- date: October 28, 2013
I'm having problems logging into a 12.04 desktop with SSH, from within the local network. Attempt results in:
>ssh -vv [email protected]
OpenSSH_4.3p2, OpenSSL 0.9.8e-fips-rhel5 01 Jul 2008
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Applying options for *
debug2: ssh_connect: needpriv 0
debug1: Connecting to xxx.xx.xxx.xxx [xxx.xx.xxx.xxx] port 22.
debug1: connect to address xxx.xx.xxx.xxx port 22: Connection timed out
ssh: connect to host xxx.xx.xxx.xxx port 22: Connection timed out
I have been using linux for a while now, so here is what I understand to be the pertinent information:
I CAN ping the system.
Openssh-server is installed and SSHD IS running.
There are NO firewall rules active (ufw or iptables).
I CAN login to the localhost.
Port 22 IS open and listening.
>nmap -sV -p 22 xxx.xx.xxx.xx
Host is up (0.000025s latency).
PORT STATE SERVICE VERSION
22/tcp open ssh OpenSSH 5.9p1 Debian 5ubuntu1.1 (protocol 2.0)
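Worth noting, since the 0.000025s nmap latency suggests the scan ran on the server itself: an sshd bound only to 127.0.0.1 would look open locally yet time out remotely. A sketch of the check, with a simulated ss line since no sshd runs here:

```shell
# On the server, inspect which address sshd is bound to:
#   ss -tln | grep ':22'        # or: netstat -tln | grep ':22'
# 127.0.0.1:22 means loopback only; 0.0.0.0:22 means all interfaces.
# Simulated loopback-only line, to show what to look for:
printf 'LISTEN 0 128 127.0.0.1:22 0.0.0.0:*\n' \
  | grep -q '127\.0\.0\.1:22' && echo 'bound to loopback only'
# The binding is set by ListenAddress in /etc/ssh/sshd_config:
#   grep -i '^ListenAddress' /etc/ssh/sshd_config
```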
Archiving many(>100000) files into a single file via cli
location: ubuntuforums.com - date: March 4, 2011
I'm trying to zip or rar >100000 files into a single file so that I can upload it to my server much faster than transferring them individually over FTP. In total they're only 4 GB, but because of the number of files, Nautilus freezes just opening the folder they're in. They're all .jpgs, all in the same folder, and I've tried a few commands but I keep getting error messages.
Does anyone have a command that will archive all the files from a folder into a single zip (or rar, tar, etc.)? I can't just archive the folder, because then I would have to move all the files out of that folder first, and just opening the folder to move them would crash it; and I don't have SSH access to that server.
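tar reads the directory entries itself rather than taking every file as a command-line argument, so the file count is not an obstacle, and -C archives a folder's contents without including the folder. A sketch with example paths:

```shell
# Archive the contents of a folder (not the folder itself) into one
# compressed tarball; example paths and files:
mkdir -p /tmp/photos
echo img > /tmp/photos/pic1.jpg
echo img > /tmp/photos/pic2.jpg
tar -czf /tmp/photos.tar.gz -C /tmp/photos .
tar -tzf /tmp/photos.tar.gz   # should list ./pic1.jpg and ./pic2.jpg
```

If the receiving side needs a .zip instead, running zip -r ../photos.zip . from inside the folder works the same way.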