Comments on "Pentaho Data Integration: Remote execution with Carte" (Diethard Steiner on Business Intelligence)

Anonymous (2014-04-16):
Hi, great blog! I am facing one issue: how do I get new job entries into the Carte web servers? Is there any way to make the entries dynamic?

Diethard (2013-11-24):
Thanks for your feedback, Max! Not that I am aware of, but you could just use the sftp command to put the file onto the remote server, and then either call the job as a web service (see my recent blog post) or start it via the Carte web interface.
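The sftp-plus-web-service route described above could be sketched roughly as below. Every host, port, credential, and path here is a made-up placeholder, and the `runJob` servlet used to trigger the job over HTTP is only available on some PDI/Carte versions, so treat this as an illustration rather than a recipe.

```shell
#!/bin/sh
# Sketch: copy a job file to the server, then trigger it over Carte's
# HTTP interface. Every host, path, and credential below is a placeholder.

CARTE_HOST=myserver.example.com     # hypothetical Carte host
CARTE_PORT=8081
CARTE_USER=cluster                  # Carte's default user/password pair
CARTE_PASS=cluster
JOB_LOCAL=./my_job.kjb
JOB_REMOTE=/opt/etl/jobs/my_job.kjb

# 1) non-interactive sftp upload (sftp reads the batch commands from stdin)
upload_cmd="printf 'put %s %s\n' ${JOB_LOCAL} ${JOB_REMOTE} | sftp -b - etl@${CARTE_HOST}"

# 2) ask Carte to execute the file that now sits on the server
run_url="http://${CARTE_HOST}:${CARTE_PORT}/kettle/runJob?job=${JOB_REMOTE}&level=Basic"
trigger_cmd="curl -s -u ${CARTE_USER}:${CARTE_PASS} '${run_url}'"

# echoed instead of executed, so the sketch is safe to run as-is
printf '%s\n' "$upload_cmd"
printf '%s\n' "$trigger_cmd"
```

Running the echoed commands by hand first is a good way to check that the endpoint exists on your Carte build; opening /kettle/status in a browser shows what the server is doing.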
Max (2013-11-24):
Hi Diethard,
Great blog, I always enjoy reading the articles about PDI when looking for tips and tricks :) Regarding this post: is there also a way to send a job to an already running Carte server via the command line?
Thanks for any hint!
Max

Diethard (2013-08-20):
Hi Johan, thanks a lot for your feedback. Yes indeed, I have my project set up in a similar fashion. Maybe at some point I'll write a blog post about general project setup to discuss this topic in a bit more detail.

Johan (2013-08-19):
A good Git setup will help you here. If you have a repo on, say, GitHub or Bitbucket with all your jobs and transformations, and that repo is cloned on the production server, you can do development work locally, push it to the main repo when you're done, and then just do a pull on the production server to get all the latest changes.

For a more advanced setup, dedicate a branch to production code, e.g. "prod". Do your development on the regular "master" branch and push all you like to GitHub/Bitbucket without worrying about messing with the production server. Once you're ready to release, merge the changes from master into "prod", and then do a pull on the server.

Taking it one step further, you can have a script run every 5 minutes or so from crontab on the server that automatically pulls any changes to "prod" down to the server. Now you've got automatic deployment -- but make sure you've tested your changes properly!

With all this, it's of course important to have parameters like database connections parameterized, probably using a KETTLE_HOME that lives inside the Git repo, differs between prod and dev, and keeps your parameters in .kettle/kettle.properties so they get loaded automatically. Then you use wrapper scripts around kitchen.sh and pan.sh to run your jobs and transformations on the server, with the wrappers setting KETTLE_HOME to the prod version. When running locally, you can have different wrapper scripts, or set it manually, for example.

You're probably already doing many of these things, so sorry if I'm rambling. I hope you find something useful here though :)

Daniel NWU (2013-06-19):
Thanks, Diethard. I think your framework would still be a better option than running through XServer.

Diethard (2013-06-19):
Ok, I see. Not really ... not that I am aware of, at least. I usually develop and test locally (or on a test server) and only then make the transformations/jobs available on the prod server.
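Johan's wrapper-script idea can be sketched as below. Everything in it -- the paths, the branch name, the repo layout with a per-environment KETTLE_HOME -- is a hypothetical example of the setup he describes, not code from the post.

```shell
#!/bin/sh
# Hypothetical wrapper around kitchen.sh for the production server.
# The repo clone tracks the "prod" branch and carries its own KETTLE_HOME.

REPO_DIR=/opt/etl/repo                        # clone of the prod branch
export KETTLE_HOME="${REPO_DIR}/config/prod"  # contains .kettle/kettle.properties
KITCHEN=/opt/pentaho/data-integration/kitchen.sh
JOB_FILE="${REPO_DIR}/jobs/load_dwh.kjb"      # job to run (placeholder name)

# The matching crontab entry for automatic deployment every 5 minutes:
#   */5 * * * * cd /opt/etl/repo && git pull --ff-only origin prod

cmd="${KITCHEN} -file=${JOB_FILE} -level=Basic"
printf '%s\n' "$cmd"   # echoed here; swap the printf for the real invocation
```

A dev wrapper would be identical except for pointing KETTLE_HOME at a dev config directory, which is exactly why keeping both configs inside the repo pays off.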
Daniel NWU (2013-06-19):
To make it simple: I have been using ssh to the server with the -X option (e.g. ssh -X user@hostname) from my client, and running spoon.sh from the command line, which forwards the Spoon GUI to my client. I then execute or create jobs/transformations, which run or are saved on the server side.

Is there an equivalent way of doing this through Carte? That is, can Spoon on my client directly execute or create jobs/transformations on the server side?

Hopefully this makes more sense to you. Thanks again.
Daniel

Diethard (2013-06-19):
I am not quite sure I understand your question: so you want to put the files on your server and schedule them? If you have ssh access to your server, you can just run the jobs/transformations using pan.sh or kitchen.sh (as you mentioned above). To put the files on your server, you can use sftp. And you have cron for scheduling.

Unrelated: in Spoon, when you specify that the job/transformation should execute on the server (Carte), you can also specify that it should execute the file there (as opposed to on your client), in which case it will copy the file across. But this is mainly useful for testing.

Daniel NWU (2013-06-19):
Thanks for the great tutorial. I have one burning question though.

Your tutorial runs jobs/transformations saved on the client side. However, is it feasible to run spoon.sh on my client computer over Carte to create jobs/transformations that are saved in the server-side repository?

I need Spoon's GUI only for defining jobs and transformations to store in the server-side repository, so that I can run them through cron on the server using pan.sh or kitchen.sh. I was originally thinking of installing an X server on the server side so that the Spoon GUI could be forwarded to my client over ssh. However, since installing an X server (particularly on Amazon AWS) is not recommended due to its heavy use of resources, I want to know whether Carte can be an alternative for a scenario like mine.

Regards.
Daniel
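The sftp + kitchen.sh + cron combination Diethard suggests could be wired up with a crontab entry along these lines; the schedule, paths, and log location are all invented for illustration.

```shell
# Hypothetical crontab entry: run a job nightly at 02:30 with kitchen.sh
# and append its output to a log file (all paths are placeholders).
30 2 * * * /opt/pentaho/data-integration/kitchen.sh -file=/opt/etl/jobs/daily_load.kjb -level=Basic >> /var/log/etl/daily_load.log 2>&1
```

In practice the command column would usually call a wrapper script that sets KETTLE_HOME first, rather than invoking kitchen.sh directly.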
Sunil (2012-05-05):
Thank you. Will keep in touch.
Regards,
Sunil

Diethard (2012-05-05):
Sounds like a plan! As I said, it's just like setting something up on a standard server. Once you know how to connect to your EC2 instance, it's just like setting up PDI on a standard server. If you haven't worked with EC2 before, setting up the EC2 instances and connecting to them will be the big learning curve ... once you're familiar with that, setting up PDI is easy. Good luck!
Sunil (2012-05-05):
Hi Diethard,
I am doing well too, thank you, and thanks a lot for your quick reply. I have that book with me and will go through it in detail. So from your input I assume there won't be many challenges in running a Pentaho server on EC2. The concept will be something like: an ETL server (Pentaho), a reporting server (not Pentaho), and a database server, all on EC2.
Regards,
Sunil George.
Diethard (2012-05-05):
Hi Sunil,
I am doing fine, thanks! How about you? Last time I used EC2, I just uploaded the PDI files to the server via sftp. It's just like a normal server to interact with, so you can use PuTTY, an ssh client, etc. The PDI installation procedure is exactly the same: simple ... just put the files in the right location.
I also recommend having a look at the "Pentaho Kettle Solutions" book, which covers this topic as well.
Best regards,
Diethard
Sunil (2012-05-04):
Hi Diethard,
Hope you are doing well. I would like to know whether there is any post or documentation available regarding Pentaho BI & DI installation and best practices on the Amazon cloud.

Typically, is it similar to an installation on a normal VM? Say we have a Linux VM available: we can connect through PuTTY and install the .bin version of the Pentaho BI suite. On the Pentaho website I read that something called an Amazon EC2 image is available. How is this different from a normal installation? Does it depend on the type of package we take from Amazon, i.e. the option to install our own software versus preinstalled software?

Many thanks in advance.
Sunil George.

Diethard (2012-05-03):
Currently there doesn't seem to be a nice way to stop Carte. If you are working on Unix or Linux, you will have to use the kill command.

Anonymous (2012-05-03):
How do you stop Carte?
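A minimal sketch of the kill-based approach, assuming Carte shows up in the process list under its usual Java main class (org.pentaho.di.www.Carte); verify the pattern on your own system before relying on it. Later PDI releases also added an HTTP service for stopping Carte, but availability depends on your version.

```shell
#!/bin/sh
# Stop Carte by finding its JVM in the process table and sending SIGTERM.
# The [o] in the pattern keeps grep from matching its own process entry.
PID=$(ps ax | grep '[o]rg.pentaho.di.www.Carte' | awk '{print $1}')

if [ -n "$PID" ]; then
  kill $PID            # intentionally unquoted: handles multiple PIDs
else
  echo "no Carte process found"
fi
```

SIGTERM gives the JVM a chance to shut down cleanly; resort to kill -9 only if the process refuses to exit.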