Re: [Hadoop-studio-users] Hadoop-studio-users Digest, Vol 13, Issue 4


papu bhattacharya
Aug 25, 2010, 10:05:11 AM
to hadoop-st...@lists.sourceforge.net
Guys,
I would be really grateful if somebody could give me a pointer on the heap issue in NetBeans.
I was under the impression that my first example in NetBeans would run, so that I could see at least one use case of Studio in action.
Any help would be really appreciated.
Papu 

On Tue, Aug 24, 2010 at 8:37 PM, <hadoop-studio...@lists.sourceforge.net> wrote:
Send Hadoop-studio-users mailing list submissions to
       hadoop-st...@lists.sourceforge.net

To subscribe or unsubscribe via the World Wide Web, visit
       https://lists.sourceforge.net/lists/listinfo/hadoop-studio-users
or, via email, send a message with subject or body 'help' to
       hadoop-studio...@lists.sourceforge.net

You can reach the person managing the list at
       hadoop-studi...@lists.sourceforge.net

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Hadoop-studio-users digest..."


Today's Topics:

  1. Re: Help deploying basic job using in-process cluster on
     Windows 7 (D J Stewart)
  2. Re: Help deploying basic job using in-process cluster on
     Windows 7 (Hadooper)
  3. Re: Hadoop-studio-users Digest, Vol 13, Issue 3
     (papu bhattacharya)


----------------------------------------------------------------------

Message: 1
Date: Tue, 24 Aug 2010 14:41:36 +0100 (BST)
From: D J Stewart <da...@karmasphere.com>
Subject: Re: [Hadoop-studio-users] Help deploying basic job using
       in-process cluster on Windows 7
To: hadoop-st...@lists.sourceforge.net
Message-ID: <alpine.DEB.1.10.1...@lithium.dhcpdomain>
Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed

On Tue, 24 Aug 2010, Hadooper wrote:

> I have cygwin installed already.  Do you happen to know what steps are
> required to "link" that to the in-process versions of Hadoop running
> through the Karmasphere plugin for Eclipse?

You should just need to add cygwin's "bin" directory to Windows' PATH:

    *  Control Panel -> System and Security -> System ->
       Advanced System Settings
    *  Click on the "Advanced" tab
    *  Click "Environment Variables" button
    *  Select "Path", click "Edit"
    *  Add ";C:\cygwin\bin" to the end of the "variable value"
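For anyone who would rather script the change than click through the dialogs, a rough equivalent from a Cygwin bash is sketched below (an illustration only; it assumes the default C:\cygwin install location, and the Control Panel route above remains the persistent fix):

```shell
# Hadoop's local job runner shells out to "chmod", so the fix only works
# if that binary is resolvable on PATH. Add cygwin's bin directory for
# the current session, then verify that chmod can be found:
export PATH="$PATH:/cygdrive/c/cygwin/bin"
command -v chmod && echo "chmod is on PATH"
```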

--
Dave Stewart



------------------------------

Message: 2
Date: Tue, 24 Aug 2010 07:24:18 -0700 (PDT)
From: Hadooper <hadoo...@gmail.com>
Subject: Re: [Hadoop-studio-users] Help deploying basic job using
       in-process cluster on Windows 7
To: hadoop-st...@lists.sourceforge.net
Message-ID:
       <e5c8054f-f471-43e6...@s9g2000yqd.googlegroups.com>
Content-Type: text/plain; charset=ISO-8859-1

Thank you for the help.  Adding cygwin's bin directory to the PATH fixed
the problem.






------------------------------

Message: 3
Date: Tue, 24 Aug 2010 20:37:48 +0530
From: papu bhattacharya <papu...@gmail.com>
Subject: Re: [Hadoop-studio-users] Hadoop-studio-users Digest, Vol 13,
       Issue 3
To: hadoop-st...@lists.sourceforge.net
Message-ID:
       <AANLkTi=W-AgnXFBexgqu78sPj...@mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"

Hi,
I did not set anything in Studio; I just installed the plugin.
How do I set the map output buffer in NetBeans?
I wonder where I can find the memory parameters for the sub-JVM.
I tested on a different machine (RHEL 5) with NetBeans 6.8 and 6.9, and
the result is the same.
It would be a great help if you could pass along a pointer.
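Not an answer, but possibly useful context: when the same job is launched from a shell instead of the IDE, both knobs can normally be passed as generic options. A sketch only, untested here; it assumes the driver goes through ToolRunner/GenericOptionsParser so that -D properties are picked up, and the jar/class names are placeholders:

```shell
# io.sort.mb sizes the in-memory map output buffer (default 100 MB);
# mapred.child.java.opts sets the heap of the JVM running the tasks.
hadoop jar wordcount.jar WordCount \
    -D io.sort.mb=10 \
    -D "mapred.child.java.opts=-Xmx512m" \
    input output
```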
Papu

On Tue, Aug 24, 2010 at 6:58 PM, <
hadoop-studio...@lists.sourceforge.net> wrote:

>
>
> Today's Topics:
>
>   1. Re: java.lang.OutOfMemoryError: Java heap space while running
>      simple job in netbeans (m/c is high end, having plenty of memory)
>      (Shevek)
>   2. Re: Help deploying basic job using in-process cluster on
>      Windows 7 (Shevek)
>   3. Re: Help deploying basic job using in-process cluster on
>      Windows 7 (Hadooper)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 24 Aug 2010 13:16:53 +0100
> From: Shevek <she...@karmasphere.com>
> Subject: Re: [Hadoop-studio-users] java.lang.OutOfMemoryError: Java
>        heap space while running simple job in netbeans (m/c is high end,
>        having plenty of memory)
> To: hadoop-st...@lists.sourceforge.net
> Message-ID: <1282652213.8091.1283.camel@noir>
> Content-Type: text/plain; charset="UTF-8"
>
> On Tue, 2010-08-24 at 13:37 +0530, papu bhattacharya wrote:
> > Hello. I'm testing the simple WordCount code with the NetBeans plugin.
> >
> >
> > I'm facing a memory error every time.
> > I'm running it on RHEL 5.
> > I'm using Hadoop 0.20.2.
> >
> > My m/c has 4 GB RAM. I have allocated NetBeans almost 3 GB.
>
> It's not NetBeans that's running out of memory. It's the child JVM
> that's running the Hadoop-local job.
>
> Have you set the map output buffer to something inordinately large?
> That's where it died.
>
> I'll have to remind myself of some details to work out where the memory
> parameters for the sub-jvm come from.
>
> > 10/08/24 17:33:14 INFO mapred.MapTask: numReduceTasks: 1
> > 10/08/24 17:33:14 INFO mapred.MapTask: io.sort.mb = 100
> > 10/08/24 17:33:14 WARN mapred.LocalJobRunner: job_local_0001
> > java.lang.OutOfMemoryError: Java heap space
> >         at org.apache.hadoop.mapred.MapTask
> > $MapOutputBuffer.<init>(MapTask.java:781)
> >         at
> > org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:350)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
> >         at org.apache.hadoop.mapred.LocalJobRunner
> > $Job.run(LocalJobRunner.java:177)
> > 10/08/24 17:33:14 INFO mapred.JobClient:  map 0% reduce 0%
> > 10/08/24 17:33:14 INFO mapred.JobClient: Job complete: job_local_0001
> > 10/08/24 17:33:14 INFO mapred.JobClient: Counters: 0
> > Exception in thread "main" java.io.IOException: Job failed!
> >         at
> > org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
> >         at WordCount.main(WordCount.java:52)
>
> S.
>
> --
> http://www.karmasphere.com/
> Karmasphere Studio for Hadoop - An intuitive visual interface to Big Data
>
>
>
>
> ------------------------------
>
> Message: 2
> Date: Tue, 24 Aug 2010 13:36:40 +0100
> From: Shevek <she...@karmasphere.com>
> Subject: Re: [Hadoop-studio-users] Help deploying basic job using
>        in-process cluster on Windows 7
> To: hadoop-st...@lists.sourceforge.net
> Message-ID: <1282653400.8091.1302.camel@noir>
> Content-Type: text/plain; charset="ANSI_X3.4-1968"
>
> On Mon, 2010-08-23 at 13:56 -0700, Hadooper wrote:
> > Hello all,
> >
> > I just got up and running with Karmasphere in Eclipse on a 64bit
> > Windows 7 box.  I tried creating a basic HadoopJob using the Hadoop
> > Map-Reduce Job (Karmasphere API).  I added an input file and all the
> > tabs (input, mapper, partitioner, comparator, combiner, reducer,
> > output) in the workflow view populate correctly.  However, when I try
> > to deploy the job to an in-process cluster I get the following error:
> >
> > Exception in thread "main" java.io.IOException: Cannot run program
> > "chmod":  CreateProcess error=2, The system cannot find the file
> > specified
> >      at java.lang.ProcessBuilder.start(Unknown Source) ....
> >
> > I've tried both Hadoop 0.18.3 and Hadoop 0.20.2 and I get the same
> > error regardless of the version.
> >
> > Does anyone have any ideas what the problem might be?  Do I need to be
> > using cygwin somehow to be able to deploy to an in-process cluster on
> > a windows machine?
>
> I don't run Windows myself, but we have colleagues who do, and I think
> the answer is "yes": you do need cygwin installed to use a local thread
> on Windows. This is a bug in Hadoop itself, not in Karmasphere Studio.
>
> S.
>
> --
> http://www.karmasphere.com/
> Karmasphere Studio for Hadoop - An intuitive visual interface to Big Data
>
>
>
>
> ------------------------------
>
> Message: 3
> Date: Tue, 24 Aug 2010 06:28:37 -0700 (PDT)
> From: Hadooper <hadoo...@gmail.com>
> Subject: Re: [Hadoop-studio-users] Help deploying basic job using
>        in-process cluster on Windows 7
> To: hadoop-st...@lists.sourceforge.net
> Message-ID:
>        <36b73c2d-0858-4466...@w30g2000yqw.googlegroups.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> I have cygwin installed already.  Do you happen to know what steps are
> required to "link" that to the in-process versions of Hadoop running
> through the Karmasphere plugin for Eclipse?
>
> Thank you for the response!
>
>
>
>
>
> ------------------------------
>
>
>
> ------------------------------
>
> _______________________________________________
> Hadoop-studio-users mailing list
> Hadoop-st...@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/hadoop-studio-users
>
>
> End of Hadoop-studio-users Digest, Vol 13, Issue 3
> **************************************************
>

------------------------------


------------------------------

_______________________________________________
Hadoop-studio-users mailing list
Hadoop-st...@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/hadoop-studio-users


End of Hadoop-studio-users Digest, Vol 13, Issue 4
**************************************************
