List: General Discussion
From: Jan Dvorak
Date: September 12 2000 3:23pm
Subject: Re: Mysql/DBI design question.
Hi,

Richard Reina wrote:
> 
> I am writing a fairly involved database application in Perl/MySQL.  The
> user interface is written in a Perl module called perlmenu, which runs
> on top of Curses.  So far I've written a couple dozen scripts that let
> the user interact with various MySQL tables.  I have set up a LAN and
> now have to get this app running on it, but am unsure how to proceed.
> 
> I would like for any user to be able to log on to any machine and run
> the program.
> Should all the scripts be stored and executed from the
> server?

You can store the scripts on every machine,
or in a central place in a shared directory
that all machines mount.
Deployment is easier in the latter case.
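For the shared-directory approach, a minimal sketch of what each workstation would need (the server name "dbserver" and the path /usr/local/dbapp are only illustrative, not anything from your setup):

```
# /etc/fstab entry on each workstation: mount the script directory
# exported by the central machine (hypothetical host and path).
dbserver:/usr/local/dbapp  /usr/local/dbapp  nfs  ro,hard,intr  0 0
```

After a `mount /usr/local/dbapp`, every user on every machine runs the
same copy of the scripts, and you only ever upgrade them in one place.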

> How should users execute them?   Through remote login (what
> happens if two users try to execute the same script at the same time?),
> or should I set it up as an Network Information Server?

It's o.k. if two users execute the same script at the same time.
It's like having two processes executing one binary.
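Concretely, each running copy of a script simply opens its own connection to the MySQL server, which serializes conflicting updates on its side. A sketch of what each script's preamble might look like, assuming DBD::mysql is installed (the database name, host, and credentials here are made up for illustration):

```perl
#!/usr/bin/perl -w
use strict;
use DBI;

# Every process running this script gets its own $dbh, so two users
# executing it at the same time don't step on each other.
my $dbh = DBI->connect(
    "DBI:mysql:database=orders;host=dbserver",   # hypothetical names
    "appuser", "secret",
    { RaiseError => 1 },
) or die $DBI::errstr;

my $sth = $dbh->prepare("SELECT id, name FROM customers");
$sth->execute;
while (my ($id, $name) = $sth->fetchrow_array) {
    print "$id\t$name\n";
}
$dbh->disconnect;
```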

You'll make better use of the computing power of your workstations
if the scripts are executed on them,
though if the scripts are lightweight,
it hardly matters.

> Is there a
> general design blueprint that I should follow?

As a rule, the Client/Server architecture
has one database server in the center
and the clients run on other machines.
The primary purpose of the server is to safely handle
the data manipulations the clients request,
as well as to take care of backups & stuff.

Under Unix, though, running the clients on separate machines
is mostly a matter of scalability.

> Any help or Ideas would be greatly appreciated.
> 
> Richard

Jan
Thread
  Mysql/DBI design question. (Richard Reina, 12 Sep)
  • Re: Mysql/DBI design question. (Jan Dvorak, 12 Sep)
  • R: Mysql/DBI design question. (Massimo Trojani, 12 Sep)