
Re: [issue8240] gridftp on galaxy.ivec.org

From: <Chris.Phillips_at_email.protected>
Date: Thu, 17 Jul 2014 03:23:46 +0000

Hi all

I have just found a really nasty issue with galaxy-data.ivec.org.

The hostname resolves to two different IP addresses:

        host galaxy-data.pawsey.ivec.org
        galaxy-data.pawsey.ivec.org is an alias for galaxydata.pawsey.ivec.org.
        galaxydata.pawsey.ivec.org has address 146.118.80.66
        galaxydata.pawsey.ivec.org has address 146.118.80.67

These seem to be two separate hosts. This may not matter for gridftp, but if you are running your own software to receive data you need to match the correct IP (as I just discovered). This might also explain Stuart’s problem getting gridftp to work all the time.
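The gotcha Chris describes can be made visible programmatically. Below is a minimal sketch, assuming Python, of how to enumerate every A record behind a round-robin name and then pin one concrete address for the life of a session; the function name `resolve_all` is illustrative, not from the original thread.

```python
import socket

def resolve_all(hostname, port=0):
    """Return every IPv4 address a hostname resolves to, in resolver order.

    A round-robin DNS name (like galaxy-data.pawsey.ivec.org above) carries
    several A records; software that assumes one host per name may land on
    a different machine on each lookup.
    """
    infos = socket.getaddrinfo(hostname, port, family=socket.AF_INET,
                               type=socket.SOCK_STREAM)
    seen = []
    for *_, sockaddr in infos:
        ip = sockaddr[0]
        if ip not in seen:        # deduplicate, keep resolver order
            seen.append(ip)
    return seen

# To always talk to the same backend, resolve once and connect by IP:
# addrs = resolve_all("galaxy-data.pawsey.ivec.org")
# sock.connect((addrs[0], port))
```

Resolving once up front and reusing the literal IP is what avoids the "wrong host on the second connection" failure mode described above.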

Cheers
Chris



On 1 Jul 2014, at 4:12 pm, Stuart Weston <stuart.weston_at_email.protected> wrote:

> I take it back, all running now :-)
>
> seems it took a while for those DNS names to get recognised this end.
>
> aust27 is going to cortex. All others will now go to galaxy; sending v252ar at the moment.
>
> Stuart Weston
>
> Mobile: 021 713062
> Skype: stuart.d.weston
> Email: stuart.weston_at_email.protected
> http://www.atnf.csiro.au/people/Stuart.Weston/index.html
>
> Software Engineer
> Institute for Radio Astronomy & Space Research (IRASR)
> School of Computing & Mathematical Sciences
> Faculty of Creative Technologies
> Auckland University of Technology, New Zealand.
> http://www.irasr.aut.ac.nz/
>
> ________________________________________
> From: owner-vlbiobs_at_email.protected
> Sent: 01 July 2014 16:35
> To: vlbiobs_at_email.protected
> Subject: Fwd: [issue8240] gridftp on galaxy.ivec.org
>
> hi all,
>
> after some delay we finally have access to our new data storage area
> at iVEC (see below). It should be possible for all sites to resume
> transfers immediately to the new machine: galaxy-data.pawsey.ivec.org
>
> Note the changed host name above - this is not the usual login node
> for galaxy. It is solely for data transfers.
>
> Please put data in the following area, using the same naming
> convention we used on cortex:
>
> galaxy-data.pawsey.ivec.org:/scratch/director831/transfers/2014/
>
> Remember to set your umask to 002 in your .bashrc.
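The umask instruction above can be checked directly. A short sketch, assuming a GNU/Linux shell (the `stat -c` flag is GNU-specific):

```shell
# Put this line in ~/.bashrc, as requested above:
umask 002

# Effect: new files are created mode 664 (rw-rw-r--), so other members
# of the project group can read and replace the transferred data.
demo_dir=$(mktemp -d)
touch "$demo_dir/demo.dat"
stat -c '%a' "$demo_dir/demo.dat"   # 664 under umask 002
rm -rf "$demo_dir"
```

Without the 002 umask (a common default is 022), files come out 644 and other group members cannot tidy up or overwrite partial transfers.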
>
> Please test if this works for you ASAP in case there are residual
> firewall issues.
>
> Note that the new storage area does not have the redundant backups we
> had on cortex, though the data should still be more secure than on a
> Mark5 diskpack or xraid. However, for safety we would encourage sites
> to hold a copy of the data until correlation has completed wherever
> possible. You can monitor whether experiments have been released on
> the correlator wiki pages:
>
> LBA:
> http://cira.ivec.org/dokuwiki/doku.php/correlator/records
>
> Auscope:
> https://docs.google.com/spreadsheet/pub?key=0Al7YbLo5pvvidEtwWkJBUF9OVHFvc1JLOFMxd1J2UHc&output=html
>
> regards,
> Cormac
>
>
> ---------- Forwarded message ----------
> From: Andrew Elwell "iVEC Help" <issue_tracker_at_email.protected>
> Date: 1 July 2014 10:12
> Subject: [issue8240] gridftp on galaxy.ivec.org
> To: c.reynolds_at_email.protected
> stuart.weston_at_email.protected
>
>
>
> Andrew Elwell <andrew.elwell_at_email.protected>
>
> Hi Cormac,
>
> We believe that globus-gridftp-server is now configured on both galaxy
> data mover nodes for
> sshftp:// URLs. The hostnames are galaxy-data1.pawsey.ivec.org /
> galaxy-data2.pawsey.ivec.org
> (which are part of the DNS round-robin address of galaxy-data.pawsey.ivec.org).
>
> We've had to downgrade the server part to globus 5.0.5 for this --
> version 5.2.4 doesn't work with sshftp at all. Sorry for the delay
> while we were debugging this! The globus-url-copy client from 5.2.4
> (the default) works fine with it.
>
> I've tested transfers to / from galaxy-data from another couple of
> hosts, but can you check it
> works as part of your workflow?
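For sites checking their workflow against the setup Andrew describes, a hypothetical invocation might look like the sketch below. `USER`, the local path, and the experiment name are placeholders, not values from the thread; the `sshftp://` scheme makes `globus-url-copy` tunnel GridFTP over ssh, matching the server configuration above. The command is wrapped in `echo` so the sketch is safe to run on a machine without globus installed.

```shell
# Placeholder source file and destination under the area given earlier:
SRC=/data/v252ar/v252ar.file
DEST="sshftp://USER@galaxy-data.pawsey.ivec.org/scratch/director831/transfers/2014/v252ar/"

# -vb prints transfer progress; -p 4 uses four parallel streams.
# Drop the leading "echo" to perform a real transfer.
echo globus-url-copy -vb -p 4 "file://$SRC" "$DEST"
```

Because `galaxy-data.pawsey.ivec.org` is the round-robin name, each invocation may land on either data mover node; gridftp itself is stateless per transfer, so that is harmless here, unlike the custom-receiver case Chris flags above.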
>
> FYI, we've also installed it on Magnus in a similar manner on
> magnus-data (consisting of
> magnus-data1, magnus-data2) if you need to copy data to/from that system too.
>
>
> Andrew
>
>
> --
> ----------------------------------------------------
> Cormac Reynolds
> Phone: +61 8 9266 3785
> Fax: +61 8 9266 9110
> email: c.reynolds_at_email.protected
> ----------------------------------------------------
Received on 2014-07-17 13:23:48