Problem during installation: the IP address is not detected. Looking for advice.

gemini1...@gmail.com

Mar 6, 2015, 3:42:12 AM
to crawlzi...@googlegroups.com
I am running Ubuntu 12.04 in a VM. The installation always shows the following prompt and then fails. How can I resolve this? Thanks!

System does not has Crawlzilla.
Identify is root
Your system information is:
Ubuntu , 12.04
No depend package to install
System has Sun Java 1.6 above version.
System has ssh.
System has ssh Server (sshd).
System has dialog.
Welcome to use Crawlzilla, this install program will create a new accunt and to assist you to setup the password of crawler.
Set password for crawler:
password:

keyin the password again:
password:

Master IP address is:
Master MAC address is:
Please confirm the install infomation of above :1.Yes 2.No

Jazz Yao-Tsung Wang

Mar 7, 2015, 2:01:30 AM
to crawlzi...@googlegroups.com
1. May I ask which VM environment you are using? The problem may be caused by the installer failing to recognize the network card device name (e.g. when it is not eth*).
2. Which version of Crawlzilla are you using?

Thanks ~

- Jazz
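
A quick way to check which interface names and addresses the installer would see is to list them directly. This is only a sketch: that the install script matches eth*-style names in the ifconfig output is an assumption.

# List the interface names the system reports (old net-tools style):
ifconfig -a | grep -E '^[a-z0-9]+' | awk '{print $1}'
# Or with iproute2, which also shows renamed devices (e.g. ens33, enp0s3):
ip -o link show | awk -F': ' '{print $2}'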

gemini1...@gmail.com

Mar 7, 2015, 8:24:02 PM
to crawlzi...@googlegroups.com
Thank you, Jazz, for the quick reply. Much appreciated!
My environment is as follows:
VMware Workstation 8
Ubuntu 12.04
Crawlzilla-1.1.2

The ifconfig output on Ubuntu is as follows:
:~$ ifconfig
eth0      Link encap:Ethernet  HWaddr 00:0c:29:a9:67:d6  
          inet addr:192.168.1.105  Bcast:192.168.1.255  Mask:255.255.255.0
          inet6 addr: fe80::20c:29ff:fea9:67d6/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:73 errors:0 dropped:0 overruns:0 frame:0
          TX packets:130 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:21573 (21.5 KB)  TX bytes:16352 (16.3 KB)
          Interrupt:19 Base address:0x2000 

lo        Link encap:Local Loopback  
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:16436  Metric:1
          RX packets:158 errors:0 dropped:0 overruns:0 frame:0
          TX packets:158 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0 
          RX bytes:18242 (18.2 KB)  TX bytes:18242 (18.2 KB)

Does this mean that even an interface like eth0 is not recognized? How can I fix this?
Thanks!
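
For reference, a common way for a shell installer to pull the IPv4 address of eth0 out of this old ifconfig output format is the sketch below. Whether the Crawlzilla install script parses ifconfig exactly this way is an assumption.

# Grab the IPv4 address of eth0 from the "inet addr:" line:
ifconfig eth0 | grep 'inet addr' | awk -F'[: ]+' '{print $4}'
# On the output above this prints 192.168.1.105, so eth0 itself is up and
# addressed; an empty "Master IP address" would then point at the script's
# parsing step rather than at the interface itself.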

Jazz Yao-Tsung Wang

Mar 7, 2015, 9:30:27 PM
to crawlzi...@googlegroups.com
I am not entirely sure which problem you are hitting; it may be related to the installation order.

I did not run into this problem when installing on Ubuntu 12.04.
I have built an OVF file that you can download and try first.



This VM is configured with two network cards: one NAT and one Host-Only.
The Host-Only card is bound to 192.168.90.11.

SSH user: crawler, password: crawlzilla
WEB user: admin, password: crawlzilla
Because the members of the Crawlzilla development team have all moved on to other jobs,
the project is now maintenance-only; no further development is being done.

root@crawlzilla:~# ifconfig
eth0      Link encap:Ethernet  HWaddr 08:00:27:f3:ad:d4 
          inet addr:10.0.2.15  Bcast:10.0.2.255  Mask:255.255.255.0
          inet6 addr: fe80::a00:27ff:fef3:add4/64 Scope:Link

          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:187062 errors:0 dropped:0 overruns:0 frame:0
          TX packets:95073 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:257269064 (257.2 MB)  TX bytes:6124367 (6.1 MB)

eth1      Link encap:Ethernet  HWaddr 08:00:27:ab:83:51 
          inet addr:192.168.90.11  Bcast:192.168.90.255  Mask:255.255.255.0
          inet6 addr: fe80::a00:27ff:feab:8351/64 Scope:Link

          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:4094 errors:0 dropped:0 overruns:0 frame:0
          TX packets:12941 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:742020 (742.0 KB)  TX bytes:25125470 (25.1 MB)


lo        Link encap:Local Loopback 
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:16436  Metric:1
          RX packets:3422 errors:0 dropped:0 overruns:0 frame:0
          TX packets:3422 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:606405 (606.4 KB)  TX bytes:606405 (606.4 KB)

root@crawlzilla:/vagrant/Crawlzilla_Install# ./install
 System does not has Crawlzilla.
 Identify is root
 Your system information is:
 Ubuntu , 12.04
 Try to automatically install:   dialog expect
 
 Ubuntu will install some packages   dialog expect
Reading package lists... Done
Building dependency tree      
Reading state information... Done
The following extra packages will be installed:
  tcl8.5
Suggested packages:
  tclreadline
The following NEW packages will be installed:
  dialog expect tcl8.5
0 upgraded, 3 newly installed, 0 to remove and 158 not upgraded.
Need to get 1,541 kB of archives.
After this operation, 5,374 kB of additional disk space will be used.
Get:1 http://free.nchc.org.tw/ubuntu/ precise/universe dialog amd64 1.1-20111020-1 [280 kB]
Get:2 http://free.nchc.org.tw/ubuntu/ precise/main tcl8.5 amd64 8.5.11-1ubuntu1 [1,098 kB]
Get:3 http://free.nchc.org.tw/ubuntu/ precise/main expect amd64 5.45-2 [163 kB]
Fetched 1,541 kB in 4s (364 kB/s)
Selecting previously unselected package dialog.
(Reading database ... 64595 files and directories currently installed.)
Unpacking dialog (from .../dialog_1.1-20111020-1_amd64.deb) ...
Selecting previously unselected package tcl8.5.
Unpacking tcl8.5 (from .../tcl8.5_8.5.11-1ubuntu1_amd64.deb) ...
Selecting previously unselected package expect.
Unpacking expect (from .../expect_5.45-2_amd64.deb) ...
Processing triggers for man-db ...
Setting up dialog (1.1-20111020-1) ...
Setting up tcl8.5 (8.5.11-1ubuntu1) ...
update-alternatives: using /usr/bin/tclsh8.5 to provide /usr/bin/tclsh (tclsh) in auto mode.
Setting up expect (5.45-2) ...
Processing triggers for libc-bin ...
ldconfig deferred processing now taking place
./install: line 480: bc: command not found

 System has Sun Java 1.6 above version.
 System has ssh.
 System has ssh Server (sshd).
 System has dialog.
 Welcome to use Crawlzilla, this install program will create a new accunt and to assist you to setup the password of crawler.
 Set password for crawler:
password:

 keyin the password again:
password:

 System detect network cards as follows:
 (1)  eth0  10.0.2.15
 (2)  eth1  192.168.90.11
 Please choose a netword card for Crawlzilla web server. Use (1/2/3):
2
./install: line 636: shoe_info: command not found
 Your choose is: 2
 Master IP address is: 192.168.90.11
 Master MAC address is:  08:00:27:ab:83:51  
 Please confirm the install infomation of above :1.Yes 2.No 
1
 Create crawler and change password.
spawn passwd crawler
Enter new UNIX password:
Retype new UNIX password:
passwd: password updated successfully
Generating public/private rsa key pair.
Created directory '/home/crawler/.ssh'.
Your identification has been saved in /home/crawler/.ssh/id_rsa.
Your public key has been saved in /home/crawler/.ssh/id_rsa.pub.
The key fingerprint is:
38:52:78:f9:4e:b0:b7:a1:44:20:e0:c3:fa:63:64:27 crawler@crawlzilla
The key's randomart image is:
+--[ RSA 2048]----+
|... .            |
|o  . o .         |
| +  . *          |
|. .  + =         |
|. E o = S        |
| + o o * o       |
|  +   . o        |
| . .             |
|                 |
+-----------------+
Could not open a connection to your authentication agent.
--2015-03-08 01:44:10--  http://sourceforge.net/projects/crawlzilla/files/stable/package/crawlzilla-1.2.1pack.tar.gz/download
Resolving sourceforge.net (sourceforge.net)... 216.34.181.60
Connecting to sourceforge.net (sourceforge.net)|216.34.181.60|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://downloads.sourceforge.net/project/crawlzilla/stable/package/crawlzilla-1.2.1pack.tar.gz?r=&ts=1425779050&use_mirror=nchc [following]
--2015-03-08 01:44:10--  http://downloads.sourceforge.net/project/crawlzilla/stable/package/crawlzilla-1.2.1pack.tar.gz?r=&ts=1425779050&use_mirror=nchc
Resolving downloads.sourceforge.net (downloads.sourceforge.net)... 216.34.181.59
Connecting to downloads.sourceforge.net (downloads.sourceforge.net)|216.34.181.59|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://nchc.dl.sourceforge.net/project/crawlzilla/stable/package/crawlzilla-1.2.1pack.tar.gz [following]
--2015-03-08 01:44:11--  http://nchc.dl.sourceforge.net/project/crawlzilla/stable/package/crawlzilla-1.2.1pack.tar.gz
Resolving nchc.dl.sourceforge.net (nchc.dl.sourceforge.net)... 211.79.60.17, 2001:e10:ffff:1f02::17
Connecting to nchc.dl.sourceforge.net (nchc.dl.sourceforge.net)|211.79.60.17|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 147726322 (141M) [application/x-gzip]
Saving to: `crawlzilla-1.2.1pack.tar.gz'

100%[=======================================================================================================>] 147,726,322  368K/s   in 6m 44s

2015-03-08 01:50:55 (357 KB/s) - `crawlzilla-1.2.1pack.tar.gz' saved [147726322/147726322]

 unpack success!
  Check and Set /etc/hosts finished.
 Add Crawlzilla service to /etc/init.d
 Crawlzilla will startup when booting
 Adding system startup for /etc/init.d/crawlzilla ...
   /etc/rc0.d/K20crawlzilla -> ../init.d/crawlzilla
   /etc/rc1.d/K20crawlzilla -> ../init.d/crawlzilla
   /etc/rc6.d/K20crawlzilla -> ../init.d/crawlzilla
   /etc/rc2.d/S20crawlzilla -> ../init.d/crawlzilla
   /etc/rc3.d/S20crawlzilla -> ../init.d/crawlzilla
   /etc/rc4.d/S20crawlzilla -> ../init.d/crawlzilla
   /etc/rc5.d/S20crawlzilla -> ../init.d/crawlzilla
 Make the slave installation package 
 Formatting HDFS...
 start up name node [Namenode] ... 
starting namenode, logging to /var/log/crawlzilla/hadoop-logs/hadoop-crawler-namenode-crawlzilla.out
 start up job node [JobTracker] ... 
starting jobtracker, logging to /var/log/crawlzilla/hadoop-logs/hadoop-crawler-jobtracker-crawlzilla.out
starting datanode, logging to /var/log/crawlzilla/hadoop-logs/hadoop-crawler-datanode-crawlzilla.out
starting tasktracker, logging to /var/log/crawlzilla/hadoop-logs/hadoop-crawler-tasktracker-crawlzilla.out
 Start up tomcat...
.....
 Tomcat may not start, please use " crawlzilla " to start
 Installed successfully!
 You can visit the manage website :http://192.168.90.11:8080
 Finish!!!
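
Two non-fatal errors appear in the log above: "bc: command not found" at script line 480 and "shoe_info: command not found" at script line 636. Installing the helper tools before running ./install may avoid blank values in the installer's prompts; this is only a guess based on the log, not a confirmed fix for the empty Master IP in the original report.

# Pre-install the tools the install script tries to use (Ubuntu 12.04):
sudo apt-get update
sudo apt-get install -y bc dialog expect openssh-server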

gemini1...@gmail.com

Mar 7, 2015, 9:56:13 PM
to crawlzi...@googlegroups.com
Thank you for the reply. Much appreciated!
Where can I download the OVF file? The earlier link does not open. Thanks!

Jazz Yao-Tsung Wang

Mar 7, 2015, 9:57:05 PM
to crawlzi...@googlegroups.com
I have built an OVF file; it is still uploading and will be available at the following path:

http://hadoopcon.org/OVA/


This VM is configured with two network cards: one NAT and one Host-Only.
The Host-Only card is bound to 192.168.90.11,
so if your VMware is configured correctly,
connecting to http://192.168.90.11:8080 should bring up the web interface.
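
From the host machine, a quick check that the Host-Only network is actually reachable before opening the browser (assuming the VM really got 192.168.90.11):

# Should answer if the Host-Only adapter on the host side is configured:
ping -c 3 192.168.90.11
# Should return an HTTP status line once Tomcat is up:
curl -I http://192.168.90.11:8080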


SSH user: crawler, password: crawlzilla
WEB user: admin, password: crawlzilla

- Jazz

gemini1...@gmail.com

Mar 11, 2015, 9:23:46 AM
to crawlzi...@googlegroups.com

The download fails with an error saying I do not have permission to access it:

Forbidden

You don't have permission to access /OVA/Crawlzilla.ova on this server.


Apache/2.2.22 (Debian) Server at hadoopcon.org Port 80

Jazz Yao-Tsung Wang

Mar 11, 2015, 9:25:07 AM
to crawlzi...@googlegroups.com
The permissions have been fixed.

